Scenario 3
I want to eat a cake; I did cardio and went to the gym yesterday, but it is not an occasion for cake:
Here, the following applies:
- X1 is the first input factor, I did cardio yesterday. Now, X1 = 1, as this factor is true.
- W1 is the weight of the first input factor, X1. In our example, W1 = 2.
- X2 is the second input factor, I went to the gym yesterday. Now, X2 = 1, as this factor is true.
- W2 is the weight of the second input factor, X2. In our example, W2 = 3.
- X3 is the third input factor, It is an occasion for cake. Now, X3 = 0, as this factor is false.
- W3 is the weight of the third input factor, X3. In our example, W3 = 6.
- threshold is 4.
We know that the neuron computes the following equation:

output = 1 if (W1 × X1) + (W2 × X2) + (W3 × X3) ≥ threshold, else 0

For scenario 3, the equation will translate to this:

(2 × 1) + (3 × 1) + (6 × 0) ≥ 4

This gives us the following equation:

2 + 3 + 0 = 5

5 ≥ 4 is true, so the neuron fires 1, which means I can eat the cake.
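The arithmetic for scenario 3 can be sketched in a few lines of Python (a minimal illustration; the variable names are my own):

```python
# Perceptron check for scenario 3 (illustrative sketch).
inputs = [1, 1, 0]   # X1: did cardio, X2: went to the gym, X3: occasion for cake
weights = [2, 3, 6]  # W1, W2, W3
threshold = 4

# Weighted sum of the binary inputs: (2*1) + (3*1) + (6*0) = 5.
weighted_sum = sum(w * x for w, x in zip(weights, inputs))

# Fire 1 if the weighted sum reaches the threshold, else 0.
output = 1 if weighted_sum >= threshold else 0
print(weighted_sum, output)  # 5 1
```

Changing X3 to 0 or 1 (or adjusting the weights) lets you replay the other scenarios with the same code.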
From the preceding three scenarios, we saw how a single artificial neuron works. This single unit is also called a perceptron. A perceptron essentially takes binary inputs, computes their weighted sum, and then compares that sum with a threshold to ultimately give a binary output.
To better appreciate how a perceptron works, let's translate the preceding equation into a more general form.
Let's assume there is just one input factor, for simplicity:
Let's also assume that threshold = b. Our equation was as follows:

output = 1 if w × x ≥ threshold, else 0

It now becomes this:

output = 1 if w × x ≥ b, else 0

It can also be written as: if w × x − b ≥ 0, then output 1, else 0.
Here, the following applies:
- w is the weight of the input
- b is the threshold and is referred to as the bias
This rule summarizes how a perceptron neuron works.
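This rule can be written as a small function (a sketch, assuming a single input; the function name is my own):

```python
def perceptron(x, w, b):
    """Single-input perceptron: output 1 if w*x - b >= 0, else 0."""
    return 1 if w * x - b >= 0 else 0

# With w = 2 and b = 4, an input of x = 1 gives 2 - 4 = -2, so the neuron does not fire.
print(perceptron(x=1, w=2, b=4))  # 0

# With w = 5 and b = 4, the same input gives 5 - 4 = 1, so the neuron fires.
print(perceptron(x=1, w=5, b=4))  # 1
```

Note that moving the threshold to the left-hand side as a bias term is only a rearrangement; the neuron's behavior is unchanged.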
Just like the mammalian brain, an ANN is made up of many such perceptrons that are stacked and layered together. In the next section, we will get an understanding of how these neurons work together within an ANN.