207. Definition of Vanishing Gradients
Difficulty: easy
What is the vanishing gradient problem in deep neural networks?
A. Gradients become exponentially small as they are propagated backward through many layers, causing early layers to learn very slowly
B. Gradients become exactly zero at saturated neurons, causing those specific neurons to stop updating permanently during backpropagation
C. Gradients become negative during backpropagation through skip connections, reversing the update direction for early layers
D. Gradients become constant across layers, causing all layers to update at the same rate regardless of their depth
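To make the shrinking-gradient behaviour concrete, here is a minimal NumPy sketch (not part of the original question) that backpropagates a unit gradient through a toy stack of sigmoid layers and prints the gradient norm at each depth. The layer count, layer width, and the 0.5 weight scale are illustrative assumptions, not values implied by the question.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers = 30      # depth of the toy network (assumed value)
width = 64         # units per layer (assumed value)

# Forward pass through a deep stack of sigmoid layers,
# keeping the activations so we can backpropagate through them.
x = rng.standard_normal(width)
weights = [rng.standard_normal((width, width)) * 0.5 for _ in range(n_layers)]
activations = [x]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: start from a unit "upstream" gradient and push it
# back through each layer, recording its norm along the way.
grad = np.ones(width)
norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * a * (1.0 - a))   # chain rule through the sigmoid
    norms.append(np.linalg.norm(grad))

for depth, n in enumerate(norms, start=1):
    print(f"{depth} layers back from the output: ||grad|| ~ {n:.2e}")

Because the sigmoid derivative is at most 0.25, the printed norms fall off roughly exponentially with depth, which is the behaviour the question is probing.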