Gradient descent
Batch gradient descent.
Difficulty: medium
Momentum in Gradient Descent
When momentum is added to gradient descent, what does it do?
A. It averages the gradients over recent iterations to produce a smoother and more reliable update direction
B. It accumulates a velocity vector in directions of persistent gradient, accelerating convergence and dampening oscillations
C. It resets the gradient to zero when the loss plateaus, preventing gradient descent from getting stuck in flat regions
D. It increases the learning rate adaptively based on the curvature of the loss surface at each iteration
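For reference, the update rule behind this question can be written in a few lines. Below is a minimal sketch of the classical (heavy-ball) momentum update; the function name, hyperparameters, and quadratic test problem are illustrative choices, not taken from this page:

```python
import numpy as np

def momentum_gd(grad, x0, lr=0.1, beta=0.9, steps=100):
    """Gradient descent with classical momentum.

    v accumulates an exponentially decaying sum of past gradients
    (a velocity vector); components that keep pointing the same way
    build up speed, while oscillating components partially cancel.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v + grad(x)  # accumulate velocity from persistent gradients
        x = x - lr * v          # step along the velocity, not the raw gradient
    return x

# Illustrative test problem: an elongated quadratic bowl f(x) = 0.5 x^T A x,
# where plain gradient descent tends to oscillate across the steep direction.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_min = momentum_gd(grad, x0=[5.0, 5.0], lr=0.05, beta=0.9, steps=200)
```

With these settings the iterate converges toward the minimizer at the origin; the velocity term lets the slowly varying direction accelerate while damping the back-and-forth component.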