696. RNN Long-Range Dependency Failure (medium)
Vanilla RNNs struggle to capture long-range dependencies in sequences. What is the primary reason?
A. Gradients either vanish or explode during backpropagation through time, as they are multiplied by the recurrent weight matrix at each step.
B. Gradients flow only through the input connections, not through the recurrent connections, during backpropagation.
C. Gradients are not propagated through the hidden state, preventing early time steps from influencing the final prediction during backpropagation through time.
D. Gradients at early time steps are overwritten by later time steps, since the hidden state is updated in place.