Artificial Neural Networks (ANN): Q.15

Q. While training a neural network with batch gradient descent, you notice after the first few epochs that the loss does not decrease. Possible reasons for this include:
1. The learning rate is too low.
2. The network is stuck in a local minimum.
3. The network has too many units in the hidden layer.
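To see why a too-low learning rate can make the loss look flat, here is a minimal sketch (a toy one-weight regression, all names and values are illustrative) comparing batch gradient descent with a very small learning rate against a reasonable one:

```python
import numpy as np

def train(lr, epochs=50, seed=0):
    # Toy regression: fit y = 2x with a single weight w via full-batch gradient descent.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, 100)
    y = 2.0 * X
    w = 0.0
    losses = []
    for _ in range(epochs):
        grad = np.mean(2 * (w * X - y) * X)  # d(MSE)/dw over the full batch
        w -= lr * grad
        losses.append(np.mean((w * X - y) ** 2))
    return losses

low_lr = train(lr=1e-4)  # learning rate too low: loss barely moves after 50 epochs
ok_lr = train(lr=0.5)    # reasonable learning rate: loss drops close to zero
```

With `lr=1e-4` the weight creeps toward its target so slowly that the loss curve looks stalled, mimicking symptom 1 above, while the same setup with `lr=0.5` converges quickly. (Option 3 is different in kind: extra hidden units may cause overfitting, but they do not by themselves prevent the training loss from decreasing.)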
