683. Residual Connections and Gradient Flow
Difficulty: medium
Residual connections (skip connections) in ResNets help address the vanishing gradient problem. How?
A. They normalize the gradient at each layer by dividing by the number of residual blocks traversed during backpropagation
B. They amplify the gradient magnitude at each layer by adding a learned scaling factor to the residual path during the backpropagation process
C. They prevent gradient shrinkage by enforcing that all layer outputs have unit variance through adaptive normalization
D. They create a direct path for gradients to flow from the output back to early layers without passing through all intermediate transformations
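The mechanism the question probes can be seen numerically. For a residual block y = x + F(x), the Jacobian is dy/dx = I + dF/dx, so the identity term carries gradient signal back through the block even when dF/dx is small. Below is a minimal sketch with linear layers in NumPy (the layer count, dimensions, and weight scale are illustrative assumptions, not from the question): it compares the end-to-end gradient of a plain stack of layers with that of the same stack wrapped in skip connections.

```python
import numpy as np

# Sketch: gradient flow through a deep stack of linear layers,
# with and without skip connections. With small weights, the plain
# product of Jacobians W_L ... W_1 shrinks toward zero, while the
# residual product (I + W_L) ... (I + W_1) retains the identity path.
rng = np.random.default_rng(0)
dim, depth = 4, 20
weights = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(depth)]

grad_plain = np.eye(dim)
grad_residual = np.eye(dim)
for W in weights:
    grad_plain = W @ grad_plain                       # dy/dx = W_L ... W_1
    grad_residual = (np.eye(dim) + W) @ grad_residual  # dy/dx = prod(I + W_i)

print("plain:   ", np.linalg.norm(grad_plain))     # collapses toward zero
print("residual:", np.linalg.norm(grad_residual))  # stays a healthy magnitude
```

The residual Jacobian never has to pass solely through the intermediate transformations, since the identity term provides a direct route for the gradient from output to input.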