ANN Q6

0. Consider the feedforward neural network in the figure shown below, which has one input layer, two hidden layers, and one output layer. The input to this network is x ∈ R^3, and the numbers of neurons in the two hidden layers and the output layer are 4, 3, and 5, respectively. Each layer is fully connected to the next layer, i.e., there is a weight connecting every neuron in layer i to every neuron in layer i+1. Also note that every neuron in the hidden and output layers has a bias connected to it. The activation function used in the hidden layers is the logistic function as defined in the lecture, and the output function is the softmax function.

Now suppose that all the weights in layer 1 are 0.05, i.e., each of the 3*4 = 12 elements of the matrix W1 has the value 0.05. Similarly, assume all the weights in layer 2 are 0.025, i.e., each of the 4*3 = 12 elements of the matrix W2 has the value 0.025. Also assume that all the weights in layer 3 are 1.0, i.e., each of the 3*5 = 15 elements of the matrix W3 has the value 1.0. Finally, the bias vectors for the 3 layers are as follows:
b1 = [0.1, 0.2, 0.3, 0.4]
b2 = [5.2, 3.2, 4.3]
b3 = [0.2, 0.45, 0.75, 0.55, 0.95]
Now, suppose we feed the input x = [1.5, 2.5, 3] to this network. What will be the value of O3 (i.e., the value output by the third neuron in the output layer)?
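
Below is a minimal NumPy sketch of the forward pass described above, assuming the logistic function σ(z) = 1/(1 + e^(−z)) and the standard softmax; the helper names (logistic, softmax) are my own, not part of the question:

```python
import numpy as np

def logistic(z):
    # Logistic (sigmoid) activation used in the hidden layers
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Numerically stable softmax for the output layer
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Weights and biases as given in the question
W1 = np.full((3, 4), 0.05)   # layer 1: 3 inputs -> 4 hidden neurons
W2 = np.full((4, 3), 0.025)  # layer 2: 4 -> 3 hidden neurons
W3 = np.full((3, 5), 1.0)    # layer 3: 3 -> 5 output neurons
b1 = np.array([0.1, 0.2, 0.3, 0.4])
b2 = np.array([5.2, 3.2, 4.3])
b3 = np.array([0.2, 0.45, 0.75, 0.55, 0.95])

x = np.array([1.5, 2.5, 3.0])

h1 = logistic(x @ W1 + b1)   # first hidden layer (4 values)
h2 = logistic(h1 @ W2 + b2)  # second hidden layer (3 values)
O = softmax(h2 @ W3 + b3)    # output layer (5 values)

print(O)     # the full output vector
print(O[2])  # O3, the output of the third neuron
```

One observation simplifies the arithmetic: because every element of W3 is 1.0, all five output pre-activations receive the same contribution from h2, and the softmax is invariant to adding a constant to all of its inputs, so the output reduces to softmax(b3), giving O3 = e^0.75 / (e^0.2 + e^0.45 + e^0.75 + e^0.55 + e^0.95) ≈ 0.229.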
