StackedML
189. Definition of Information Gain (easy)
Information gain is used as a splitting criterion in decision trees. How is it computed?
A. The difference in accuracy between the parent node's majority class prediction and the split prediction
B. The Gini impurity of the parent node minus the sum of Gini impurities across all child nodes
C. The ratio of correctly classified samples before and after the split weighted by node size
D. The entropy of the parent node minus the weighted average entropy of the child nodes after the split
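For reference, the entropy-based computation of information gain can be sketched as follows. This is a minimal illustration, not any library's implementation; the function names `entropy` and `information_gain` are my own.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a sequence of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Parent entropy minus the size-weighted average entropy of the children."""
    n = len(parent)
    weighted_child_entropy = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted_child_entropy

# A perfectly pure split of a balanced binary node gains 1 bit:
parent = [0, 0, 1, 1]
print(information_gain(parent, [[0, 0], [1, 1]]))  # → 1.0
```

A split that leaves the children as mixed as the parent yields a gain of 0, which is why decision-tree induction greedily picks the split with the highest gain.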