StackedML
421. KL Divergence as Loss (medium)
KL divergence is used as a loss function in variational autoencoders. What does it measure?
A. The symmetric difference between two probability distributions, regardless of their ordering or direction
B. The squared difference between the means of two distributions, normalized by their combined variance
C. The cross-entropy between two distributions minus the entropy of the target distribution
D. The information lost when using distribution Q to approximate distribution P, penalizing Q for deviating from P
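For concreteness, here is a minimal NumPy sketch of the forward KL divergence between two discrete distributions. The arrays `p` and `q` are illustrative values, not from the question; the sketch also checks the identity relating KL to cross-entropy and entropy, and shows that KL is not symmetric.

```python
import numpy as np

def kl_divergence(p, q):
    """Forward KL divergence D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    Assumes p and q are strictly positive and each sums to 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Two example discrete distributions over three outcomes
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

kl_pq = kl_divergence(p, q)  # penalizes Q where it underweights P
kl_qp = kl_divergence(q, p)

# KL is asymmetric: D_KL(P||Q) != D_KL(Q||P) in general
print(kl_pq, kl_qp)

# KL(P||Q) equals cross-entropy H(P, Q) minus entropy H(P)
cross_entropy = -np.sum(p * np.log(q))
entropy = -np.sum(p * np.log(p))
print(np.isclose(kl_pq, cross_entropy - entropy))
```

Note that the identity checked at the end is exactly the relationship stated in option C, which is why that option reads plausibly; the directional, asymmetric phrasing in option D describes what the quantity measures.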