175. Definition of Convergence (easy)
What does convergence mean in the context of gradient descent optimization?
A. It means the parameter updates become negligibly small and the loss stops decreasing meaningfully
B. It means the gradient magnitude reaches its theoretical minimum as determined by the loss function curvature
C. It means the training loss reaches exactly zero and all training samples are correctly predicted
D. It means the validation loss equals the training loss, indicating the model has generalized perfectly
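To make the idea of convergence concrete, here is a minimal sketch of gradient descent with a stopping criterion: the loop halts once the parameter update falls below a tolerance. The function names, learning rate, and tolerance are illustrative choices, not part of the question.

```python
def gradient_descent(grad, x0, lr=0.1, tol=1e-6, max_iters=10_000):
    """Minimize a 1-D function via gradient descent.

    Stops when the parameter update becomes negligibly small,
    which is a common practical convergence criterion.
    """
    x = x0
    for i in range(max_iters):
        step = lr * grad(x)  # gradient descent update: x <- x - lr * grad(x)
        x -= step
        if abs(step) < tol:  # update is negligibly small: declare convergence
            return x, i + 1
    return x, max_iters

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
# The iterates approach the minimizer x = 3.
x_star, iters = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Note that convergence here is about the optimization process settling, not about the loss reaching any particular value.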