Gradient descent
Batch gradient descent.
Gradient Descent Update Rule
Question 1 of 10 (easy)
What does gradient descent do at each iteration?
A. It updates parameters by randomly sampling a new point near the current position and accepting it if the loss improves
B. It updates parameters by computing the second derivative of the loss and applying a Newton step
C. It updates parameters by moving in the direction opposite to the gradient of the loss function
D. It updates parameters by moving in the direction of the gradient to maximize the loss function
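For context, the update rule the question asks about can be sketched in a few lines. The one-dimensional quadratic loss, the learning rate, and the function names below are illustrative assumptions, not part of the question:

```python
def gradient_descent(grad, theta0, lr=0.1, steps=100):
    """Repeatedly apply the update theta <- theta - lr * grad(theta)."""
    theta = theta0
    for _ in range(steps):
        # Move opposite the gradient to reduce the loss.
        theta = theta - lr * grad(theta)
    return theta

# Example (assumed for illustration): minimize L(theta) = (theta - 3)^2,
# whose gradient is 2 * (theta - 3). The minimum is at theta = 3.
minimum = gradient_descent(lambda t: 2 * (t - 3), theta0=0.0)
print(round(minimum, 4))  # converges toward 3.0
```

With a small enough learning rate, each step decreases the loss, which is the defining behavior being tested here.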