0. Which of the following is true?
In batch gradient descent, we update the weights and biases of the neural network after a forward pass over each training example.
In batch gradient descent, we update the weights and biases of the neural network after a forward pass over all the training examples.
Each step of stochastic gradient descent takes more time than each step of batch gradient descent.
None of these three options is correct.
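A minimal sketch may help clarify the distinction the options turn on: when the weight update happens relative to the forward passes. The 1-D linear model, toy data, and learning rate below are illustrative assumptions, not part of the question.

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0])   # toy inputs (assumed)
Y = np.array([3.0, 5.0, 7.0, 9.0])   # toy targets for y = 2x + 1 (assumed)
lr = 0.01                            # learning rate (assumed)

def batch_gd_step(w, b):
    # Batch GD: forward pass over ALL training examples, then ONE update
    # using the gradient of the mean squared error over the whole set.
    preds = w * X + b
    grad_w = np.mean(2 * (preds - Y) * X)
    grad_b = np.mean(2 * (preds - Y))
    return w - lr * grad_w, b - lr * grad_b

def sgd_epoch(w, b):
    # SGD: one update after the forward pass over EACH training example.
    for x, y in zip(X, Y):
        pred = w * x + b
        w -= lr * 2 * (pred - y) * x
        b -= lr * 2 * (pred - y)
    return w, b

w, b = 0.0, 0.0
w, b = batch_gd_step(w, b)  # one update per full pass over the data
w, b = sgd_epoch(w, b)      # len(X) updates per pass, one per example
print(w, b)
```

In this sketch, batch_gd_step touches every example before making a single update, while sgd_epoch performs len(X) updates per pass, each computed from just one example.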