StackedML
509.
Momentum in Gradient Descent
medium
When momentum is added to gradient descent, what does it do?
A
It averages the gradients over recent iterations to produce a smoother and more reliable update direction
B
It accumulates a velocity vector in directions of persistent gradient, accelerating convergence and dampening oscillations
C
It resets the gradient to zero when the loss plateaus, preventing gradient descent from getting stuck in flat regions
D
It increases the learning rate adaptively based on the curvature of the loss surface at each iteration
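The classic momentum (heavy-ball) update maintains a velocity vector v that accumulates gradients: v ← βv + ∇f(θ), θ ← θ − ηv. A minimal sketch (function names, the test objective, and hyperparameters here are illustrative, not from the question):

```python
import numpy as np

def momentum_gd(grad, theta, lr=0.02, beta=0.9, steps=200):
    """Gradient descent with classic (heavy-ball) momentum.

    grad: function returning the gradient at theta
    lr:   learning rate (eta)
    beta: momentum coefficient; beta=0 recovers plain gradient descent
    """
    v = np.zeros_like(theta)
    for _ in range(steps):
        v = beta * v + grad(theta)  # velocity accumulates persistent gradient directions
        theta = theta - lr * v      # step along the accumulated velocity
    return theta

# Illustrative objective: an elongated quadratic f(x, y) = 0.5*(x**2 + 25*y**2),
# where plain gradient descent oscillates across the narrow valley
grad = lambda p: np.array([p[0], 25.0 * p[1]])
theta = momentum_gd(grad, np.array([1.0, 1.0]))
```

On ill-conditioned surfaces like this one, the gradient components that flip sign between steps partially cancel inside v (dampening oscillations), while components that point the same way step after step build up (accelerating progress along the valley floor).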