468. Local Gradients in Backpropagation (easy)

In backpropagation, what is a local gradient?

A. The derivative of a node's output with respect to its inputs, computed independently of the rest of the network
B. The derivative of the activation function evaluated at the layer's pre-activation value during the forward pass
C. The derivative of the loss with respect to a specific layer's weights after combining upstream and local information
D. The derivative of the loss with respect to the network's input, used to analyze input feature importance
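To make the terminology concrete, here is a minimal illustrative sketch (not from the question source): a single multiply node inside a larger computation graph. The node's local gradients depend only on its own inputs, while the backward pass combines them with an upstream gradient via the chain rule; the variable names and the assumed upstream value are hypothetical.

```python
def multiply_forward(x, y):
    """Forward pass of a multiply node: z = x * y."""
    z = x * y
    # Local gradients: derivatives of this node's output with respect to
    # its own inputs, computable without knowing the rest of the network.
    local_grads = {"dz_dx": y, "dz_dy": x}
    return z, local_grads

def multiply_backward(upstream, local_grads):
    """Chain rule: downstream gradient = upstream gradient * local gradient."""
    dL_dx = upstream * local_grads["dz_dx"]
    dL_dy = upstream * local_grads["dz_dy"]
    return dL_dx, dL_dy

z, cache = multiply_forward(3.0, 4.0)          # z = 12.0
dL_dx, dL_dy = multiply_backward(2.0, cache)   # suppose dL/dz = 2.0 arrives from upstream
print(dL_dx, dL_dy)  # 8.0 6.0
```

Note how `multiply_forward` never touches the loss: the local gradients are cached during the forward pass and only combined with upstream information later.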