50. Batch Normalization Mechanism
hard

Batch normalization is applied between a linear layer and its activation function. What does it do to the pre-activation values?
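The mechanism the question points at can be sketched in a few lines: per feature, the pre-activations are standardized to zero mean and unit variance over the batch, then rescaled and shifted by learned parameters. This is a minimal NumPy sketch; the function name `batch_norm` and the toy shapes are illustrative assumptions, not part of the question.

```python
import numpy as np

def batch_norm(z, gamma, beta, eps=1e-5):
    """Sketch of batch norm on pre-activations z of shape (batch, features)."""
    mu = z.mean(axis=0)                     # per-feature mean over the batch
    var = z.var(axis=0)                     # per-feature variance over the batch
    z_hat = (z - mu) / np.sqrt(var + eps)   # standardized: ~zero mean, ~unit variance
    return gamma * z_hat + beta             # learned scale (gamma) and shift (beta)

rng = np.random.default_rng(0)
z = rng.normal(loc=3.0, scale=2.0, size=(64, 8))  # toy pre-activations from a linear layer
out = batch_norm(z, gamma=np.ones(8), beta=np.zeros(8))
print(out.mean(axis=0).round(6))  # ~0 per feature
print(out.std(axis=0).round(3))   # ~1 per feature
```

With `gamma = 1` and `beta = 0` the output is simply the standardized pre-activations; during training, `gamma` and `beta` let the network recover any other scale and offset it needs before the activation is applied.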