701. Role of Learning Rate (easy)
What does the learning rate control in gradient descent?
A. The threshold below which the gradient magnitude is considered negligible and training stops
B. The number of iterations before gradient descent is considered to have converged
C. The step size taken in the direction of the negative gradient at each update
D. The fraction of training samples used to compute the gradient at each update step
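To see concretely what the learning rate does, here is a minimal sketch (not part of the original question) of gradient descent on the one-dimensional function f(x) = x², whose gradient is 2x. The parameter `lr` scales the step taken along the negative gradient at each update; the function name and values are illustrative only.

```python
# Minimal gradient descent on f(x) = x^2 (gradient: 2x).
# The learning rate `lr` controls the step size of each update.

def gradient_descent(x0, lr, steps):
    x = x0
    for _ in range(steps):
        grad = 2 * x        # gradient of f at the current point
        x = x - lr * grad   # step in the negative-gradient direction, scaled by lr
    return x

# A moderate learning rate drives x toward the minimum at 0,
# while an overly large one makes the iterates overshoot and diverge.
print(gradient_descent(5.0, lr=0.1, steps=100))   # close to 0
print(gradient_descent(5.0, lr=1.1, steps=100))   # far from 0
```

With lr = 0.1 each update multiplies x by (1 − 0.2) = 0.8, so the iterates shrink geometrically toward 0; with lr = 1.1 each update multiplies x by (1 − 2.2) = −1.2, so they grow without bound. This is why the step size, not a stopping threshold or iteration count, is what the learning rate controls.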