Backpropagation (chain rule intuition)
825. Why Backpropagation Stores Activations
Difficulty: medium
Backpropagation requires storing the intermediate activations computed during the forward pass. Why?
A. It needs them to compute the Hessian matrix required for second-order optimization during the backward pass
B. It needs them to verify that the forward pass was numerically stable before computing the backward pass gradients
C. It needs them to reconstruct the input from the output, enabling gradient flow through non-invertible operations
D. It needs them to compute local gradients during the backward pass; the gradient depends on the forward activation value
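The premise in the question stem can be illustrated with a minimal sketch (a hypothetical example, not part of the original question): a single sigmoid neuron whose forward pass caches values that the backward pass then reads to form local gradients.

```python
# Minimal sketch: forward pass caches activations; backward pass uses them.
# All names here (forward, backward, cache) are illustrative assumptions.
import math

def forward(x, w):
    z = w * x                    # linear step
    a = 1 / (1 + math.exp(-z))   # sigmoid activation
    cache = (x, a)               # stored for the backward pass
    return a, cache

def backward(dL_da, w, cache):
    x, a = cache
    # The local sigmoid gradient da/dz = a * (1 - a) needs the stored
    # activation a; without the cache it cannot be computed.
    dL_dz = dL_da * a * (1 - a)
    dL_dw = dL_dz * x            # needs the stored input x
    dL_dx = dL_dz * w
    return dL_dw, dL_dx
```

Note how neither a Hessian nor an input reconstruction appears anywhere; the cache exists solely because the chain-rule factors are functions of the forward values.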