Gradient descent
Batch gradient descent.
Role of Learning Rate (easy)
What does the learning rate control in gradient descent?
A
The threshold below which the gradient magnitude is considered negligible and training stops
B
The number of iterations before gradient descent is considered to have converged
C
The step size taken in the direction of the negative gradient at each update
D
The fraction of training samples used to compute the gradient at each update step
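To make the role of the learning rate concrete, here is a minimal sketch (illustrative, not part of the original question set) of gradient descent on f(x) = x², whose gradient is f'(x) = 2x. The function name and parameters are chosen for this example only.

```python
def gradient_descent(x0, learning_rate, steps):
    """Return the iterates of x_{t+1} = x_t - learning_rate * f'(x_t)."""
    x = x0
    history = [x]
    for _ in range(steps):
        grad = 2 * x                   # gradient of f(x) = x^2
        x = x - learning_rate * grad   # learning rate scales the step size
        history.append(x)
    return history

# A larger learning rate takes bigger steps toward the minimum at x = 0:
# with lr = 0.1 each update multiplies x by 0.8; with lr = 0.4, by 0.2.
small_lr = gradient_descent(x0=1.0, learning_rate=0.1, steps=5)
large_lr = gradient_descent(x0=1.0, learning_rate=0.4, steps=5)
print(small_lr[-1], large_lr[-1])
```

Both runs move in the direction of the negative gradient; only how far each update travels differs, which is exactly what the learning rate controls.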