8. AdaBoost Sample Reweighting (easy)
In AdaBoost, how are training samples reweighted after each iteration?
A. Weights are randomly shuffled after each iteration to prevent the ensemble from memorizing training order
B. Misclassified samples receive higher weights so the next learner focuses on harder examples
C. All samples receive equal weights after each iteration to prevent any single learner from dominating
D. Correctly classified samples receive higher weights to reinforce the patterns the ensemble has already learned
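The reweighting rule the question is probing can be sketched in a few lines. Below is a minimal, illustrative AdaBoost.M1-style update in NumPy (the function name and toy labels are assumptions for this sketch, not from any particular library): misclassified samples are multiplied by e^alpha > 1, correctly classified samples by e^-alpha < 1, and the weights are renormalized to a distribution.

```python
import numpy as np

def adaboost_reweight(weights, y_true, y_pred):
    """One AdaBoost.M1 reweighting step (illustrative sketch)."""
    miss = (y_true != y_pred).astype(float)
    # Weighted error rate of the current weak learner
    err = np.sum(weights * miss) / np.sum(weights)
    # Learner weight alpha: larger when the weak learner is more accurate
    alpha = 0.5 * np.log((1 - err) / err)
    # Up-weight misclassified samples (factor e^alpha), down-weight correct ones
    new_w = weights * np.exp(alpha * np.where(miss == 1, 1.0, -1.0))
    # Renormalize so the weights remain a probability distribution
    return new_w / new_w.sum()

# Toy example: 4 samples, the last one misclassified
w = np.full(4, 0.25)
y = np.array([1, 1, -1, -1])
pred = np.array([1, 1, -1, 1])
w2 = adaboost_reweight(w, y, pred)
```

After the update, the single misclassified sample carries half the total weight, while each correctly classified sample drops to 1/6, which is exactly the behavior the question asks about.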