245. Dying ReLU Problem
Difficulty: medium
The dying ReLU problem occurs during training. What causes it?
A. Neurons receive consistently negative inputs and output zero permanently, causing their gradients to become zero and weights to stop updating
B. Neurons become correlated across the network, causing redundant activations that reduce effective model capacity
C. Neurons output zero for positive inputs due to a sign error in the activation function implementation during the forward pass computation
D. Neurons receive consistently large positive inputs and saturate, causing their gradients to vanish and training to stall
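For readers who want to see the mechanism concretely, here is a minimal NumPy sketch of the phenomenon the question asks about. The single-unit setup, variable names, and toy data are illustrative assumptions, not part of the original question: a ReLU unit whose pre-activation is negative on every training example outputs zero, its gradient through the ReLU is zero, and so its weights never update.

```python
import numpy as np

# Illustrative sketch: one linear unit with a ReLU activation, trained with
# mean squared error and plain gradient descent (all names and data are toy).
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))                 # toy inputs
y = rng.normal(size=(100, 1))                 # toy targets

w = np.array([[1.0]])
b = np.array([-10.0])                         # large negative bias: z = xw + b < 0 for all x

lr = 0.1
for step in range(5):
    z = x @ w + b                             # pre-activation (all negative here)
    a = np.maximum(z, 0.0)                    # ReLU output: all zeros
    grad_a = 2 * (a - y) / len(x)             # dLoss/da for mean squared error
    grad_z = grad_a * (z > 0)                 # ReLU gradient is 0 wherever z <= 0
    grad_w = x.T @ grad_z                     # chain rule back to the weight
    grad_b = grad_z.sum(axis=0)
    w -= lr * grad_w                          # no change: gradients are all zero
    b -= lr * grad_b
    print(step, "grad_w:", grad_w.ravel(), "w:", w.ravel())
```

Running this prints a zero weight gradient at every step, so the unit's parameters stay frozen even though the loss is nonzero.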