51. Batch Normalization Operation (easy)
What does batch normalization do to the pre-activation values within a mini-batch?
A. It normalizes them to match the distribution of the first mini-batch seen during training as a stable reference
B. It normalizes them by dividing by the running mean accumulated across all previous mini-batches during training
C. It normalizes them to zero mean and unit variance across the batch, then applies learned scale and shift parameters
D. It normalizes them to lie within the range [0, 1] by dividing by the maximum absolute value in the batch
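As a study aid, the batch-normalization forward pass described in the options can be sketched in NumPy. This is a minimal training-time sketch (the function name `batch_norm` and the toy data are illustrative, not from any particular library): it computes per-feature statistics over the batch, standardizes, then applies learned `gamma` (scale) and `beta` (shift) parameters.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-time batch norm over a (batch, features) array of pre-activations."""
    mu = x.mean(axis=0)                    # per-feature mean over the mini-batch
    var = x.var(axis=0)                    # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta            # learned scale and shift

# Toy mini-batch: 32 examples, 4 features, deliberately off-center and spread out.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))

print(np.allclose(y.mean(axis=0), 0.0, atol=1e-6))  # normalized mean is ~0
print(np.allclose(y.std(axis=0), 1.0, atol=1e-3))   # normalized std is ~1
```

With `gamma=1` and `beta=0` the output statistics match the standardized form; during training the network is free to learn other values, so normalization does not permanently constrain the representation. (At inference time, frameworks typically substitute running estimates of the mean and variance for the per-batch statistics.)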