804. Vanishing Gradient Problem (medium)
What is the vanishing gradient problem in the context of gradient descent for deep networks?
A. Gradients become exponentially small as they propagate backward through many layers, stalling updates in early layers
B. Gradients oscillate between positive and negative values during backpropagation, preventing stable convergence
C. Gradients become exponentially large as they propagate backward through many layers, causing parameter divergence
D. Gradients become zero at flat regions of the loss surface, causing gradient descent to stop updating parameters
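For intuition about the phenomenon the question refers to, below is a minimal sketch (not part of the original question) that propagates a gradient backward through a stack of sigmoid layers and prints the gradient norm at each depth. The layer count, width, and weight scale are arbitrary illustrative choices; since each sigmoid derivative is at most 0.25, the norms typically decay roughly geometrically toward the earliest layers.

```python
# Illustrative sketch: gradient magnitudes shrinking during backprop
# through a deep stack of sigmoid layers. All sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers, width = 20, 32
# Small random weights; the weight scale strongly affects how fast gradients shrink.
weights = [rng.normal(0.0, 1.0, size=(width, width)) / np.sqrt(width)
           for _ in range(n_layers)]

# Forward pass, storing activations for the backward pass.
a = rng.normal(size=width)
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: push an upstream gradient of ones through each layer.
grad = np.ones(width)
norms = []
for W, act in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * act * (1.0 - act))  # chain rule through sigmoid'
    norms.append(np.linalg.norm(grad))

# norms[0] is the layer nearest the output; norms[-1] is the earliest layer.
print([f"{n:.2e}" for n in norms])
```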