47. Batch Normalization as Regularization (medium)
Batch normalization acts as a regularizer. What is the mechanism behind this effect?
A. Each example's normalization depends on the other examples in its mini-batch, introducing stochastic noise that prevents overfitting
B. Each example's normalization applies a random scale and shift drawn from the learned γ and β distributions
C. Each example's normalization discards absolute activation magnitudes, forcing neurons to encode relative rather than absolute information
D. Each example's normalization zeros out activations below the batch mean, creating implicit sparsity in each layer
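Before answering, it can help to see the batch-dependence in the options above concretely. This sketch (not part of the quiz; plain NumPy, training-mode statistics only, no learned γ/β) normalizes the same example inside two different mini-batches and checks whether its normalized value changes:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Training-mode batch norm: normalize a (batch, features) array
    using the mean and variance of this mini-batch."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mu) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
example = np.array([[1.0, -2.0, 0.5]])  # one fixed example

# Put the same example into two mini-batches with different batch-mates.
batch_a = np.concatenate([example, rng.normal(size=(7, 3))])
batch_b = np.concatenate([example, rng.normal(size=(7, 3))])

out_a = batch_norm(batch_a)[0]  # normalized `example` in batch A
out_b = batch_norm(batch_b)[0]  # normalized `example` in batch B

print(out_a)
print(out_b)
# The same input is normalized differently depending on its batch-mates,
# so each training step sees a slightly noisy version of the activation.
assert not np.allclose(out_a, out_b)
```

Because the batch mean and variance are recomputed from whichever examples happen to be sampled together, the normalized activation of a fixed input fluctuates from step to step during training (at inference, fixed running statistics remove this noise).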