StackedML
814. Weight Sharing in Neural Networks (medium)
What is weight sharing in a neural network and where is it most commonly used?
A. It copies weights from a pretrained network and freezes them to prevent updates during fine-tuning
B. It averages the weights of multiple independently trained networks to produce an ensemble prediction
C. It reuses the same weights across multiple locations in the input, reducing parameters and encoding spatial invariance
D. It initializes all weights in a layer to the same value to ensure symmetric gradient flow during training
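As background for the concept the question tests, here is a minimal sketch (not part of the original question) of weight sharing as used in convolutional layers: a single small kernel is slid across the input, so the same few weights are reused at every position, rather than learning a separate weight for each input-output pair as a fully connected layer would.

```python
import numpy as np

# Weight sharing demo: a 1-D convolution applies the SAME 3-weight
# kernel at every position of the input, instead of learning a
# separate weight per (input, output) pair as a dense layer would.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)       # input signal of length 10
kernel = rng.standard_normal(3)   # one shared set of 3 weights

# Slide the shared kernel across the input ("valid" convolution):
# output position i sees input window x[i:i+3].
out = np.array([x[i:i + 3] @ kernel for i in range(len(x) - 2)])

shared_params = kernel.size       # 3 shared weights, reused 8 times
dense_params = len(x) * len(out)  # 10 * 8 = 80 for an equivalent dense map
print(shared_params, dense_params)
```

Because every output position is computed with identical weights, a pattern detected at one location is detected the same way at any other, which is the spatial-invariance property the shared kernel encodes.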