Q. While training a neural network with batch gradient descent, you notice after the first few epochs that the loss does not decrease. Possible reasons for this are:
1. The learning rate is too low.
2. The network is stuck in a local minimum.
3. The network has too many units in the hidden layer.
1 or 2
1 or 3
2 or 3
1 only
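Answer: 1 or 2. A learning rate that is too small makes each weight update negligible, so the loss barely moves, and at a local minimum the gradient is close to zero, so updates stall. Having too many hidden units adds capacity and would not by itself stop the loss from decreasing. The sketch below (assuming a toy 2-2-1 sigmoid network trained on XOR with full-batch gradient descent in NumPy; the network, data, and learning rates are illustrative choices, not from the question) shows reason 1 in action: the same network trained with a very low learning rate has an almost flat loss, while a larger rate makes it fall.

```python
# Minimal sketch: full-batch gradient descent on XOR with a 2-2-1 sigmoid
# network, comparing a very low learning rate against a reasonable one.
# All names and hyperparameters here are illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(lr, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(size=(2, 2)); b1 = np.zeros((1, 2))
    W2 = rng.normal(size=(2, 1)); b2 = np.zeros((1, 1))
    for epoch in range(epochs):
        # Forward pass over the full batch.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        loss = np.mean((out - y) ** 2)
        # Backward pass: MSE gradient through the sigmoid activations.
        d_out = 2 * (out - y) / len(X) * out * (1 - out)
        d_h = d_out @ W2.T * h * (1 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)
        if epoch % 500 == 0:
            print(f"lr={lr:<6} epoch={epoch:4d} loss={loss:.4f}")

train(lr=0.001)  # loss stays nearly flat: training looks stalled (reason 1)
train(lr=0.5)    # same network and data, but the loss visibly decreases
```

Running this prints the loss every 500 epochs: the lr=0.001 run hovers near its starting value, which is exactly the symptom described in the question, while the lr=0.5 run drops.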