296. Feature Standardization Before PCA (easy)
Before applying PCA, why is it important to standardize the features?
A. Features with larger scales dominate the variance and would unfairly control the principal components
B. PCA requires all features to follow a normal distribution, and standardization achieves approximate normality
C. Standardization removes correlations between features, making PCA equivalent to independent component analysis
D. Standardization ensures the covariance matrix is positive definite, which is required for eigendecomposition
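The scale effect behind this question is easy to demonstrate numerically. Below is a minimal sketch (the feature names and magnitudes are illustrative, not from the question): two correlated features on very different scales, with the first principal component computed from an eigendecomposition of the covariance matrix, before and after standardization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated features on very different scales:
# height in metres (spread ~0.1) and income in dollars (spread ~thousands).
height = rng.normal(1.7, 0.1, size=500)
income = 30_000 * height + rng.normal(0, 5_000, size=500)
X = np.column_stack([height, income])

def first_pc(data):
    """Return the unit eigenvector of the covariance matrix
    with the largest eigenvalue, i.e. the first principal component."""
    cov = np.cov(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, np.argmax(eigvals)]

# Without standardization: income's huge variance (in squared dollars)
# swamps height's, so the first PC points almost entirely along income.
pc_raw = first_pc(X)

# With standardization (zero mean, unit variance per feature), PCA
# operates on the correlation structure and both features contribute.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
pc_std = first_pc(X_std)

print(np.abs(pc_raw))  # height loading near 0, income loading near 1
print(np.abs(pc_std))  # both loadings comparable in magnitude
```

This is why answer A is the standard justification: variance is measured in each feature's own units, so without standardization a unit change (metres vs. dollars) contributes wildly different amounts to the components.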