StackedML
241. Dropout in Convolutional vs Fully Connected Layers (medium)
Dropout is less commonly applied to convolutional layers than to fully connected layers. Why?
A. Convolutional layers use batch normalization, which conflicts with dropout when both are applied to the same layer
B. Convolutional layers already have fewer parameters than fully connected layers, making additional regularization unnecessary
C. Convolutional layers share weights across locations, so dropping units has less regularizing effect than in fully connected layers
D. Convolutional layers produce spatially correlated activations, causing standard dropout to leave adjacent units correlated