Vanishing / exploding gradients
184. Definition of Exploding Gradients (easy)

What is the exploding gradient problem?

A. Gradients grow exponentially large during backpropagation, causing parameter updates to destabilize training
B. Gradients oscillate between large positive and negative values, preventing the optimizer from converging to a stable solution
C. Gradients grow linearly across layers, causing later layers to update faster than earlier layers during training
D. Gradients grow proportionally to the learning rate, causing instability only when the learning rate is set too high
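The phenomenon the question describes can be seen directly in a small numerical experiment: when each layer's weight matrix scales the backpropagated signal by a factor larger than 1, the gradient norm grows roughly exponentially with depth. Below is a minimal NumPy sketch of backprop through a deep linear stack; the layer count, width, and the deliberately-too-large weight scale are illustrative choices, not values from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 64

# Weight std is chosen slightly too large for this width, so each
# backprop step multiplies the gradient norm by roughly 1.5.
Ws = [rng.normal(0.0, 1.5 / np.sqrt(width), (width, width)) for _ in range(depth)]

grad = np.ones(width)  # gradient arriving at the output layer
norms = []
for W in reversed(Ws):
    grad = W.T @ grad  # one linear backprop step (activation derivative ~ 1)
    norms.append(np.linalg.norm(grad))

# The gradient reaching the first layer is many orders of magnitude
# larger than where it started: the exploding gradient problem.
print(f"gradient norm at first layer: {norms[-1]:.3e}")
```

Shrinking the weight std below `1 / sqrt(width)` flips the same experiment into the vanishing-gradient regime, which is why the two problems are usually discussed together.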