StackedML
336. Gradient Descent Update Rule (easy)
What does gradient descent do at each iteration?
A. It updates parameters by randomly sampling a new point near the current position and accepting it if the loss improves
B. It updates parameters by computing the second derivative of the loss and applying a Newton step
C. It updates parameters by moving in the direction opposite to the gradient of the loss function
D. It updates parameters by moving in the direction of the gradient to maximize the loss function
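As an illustration of the update rule the question asks about, gradient descent repeatedly moves a parameter opposite to the gradient of the loss: x ← x − lr · ∇L(x). A minimal sketch on the one-dimensional loss f(x) = (x − 3)², whose gradient is 2(x − 3) (the function, learning rate, and step count here are illustrative choices, not part of the question):

```python
# Minimal gradient descent on f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
# Each iteration moves the parameter a small step against the gradient:
#   x <- x - lr * grad(x)
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step opposite to the gradient direction
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # prints 3.0, the minimizer of f
```

Because each step points away from the gradient (the direction of steepest increase), the iterates descend the loss surface and converge toward the minimum at x = 3.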