StackedML
591. Perceptron Convergence Theorem (medium)
The perceptron convergence theorem guarantees convergence under what condition?
A. The training data is linearly separable — the perceptron finds a separating hyperplane in a finite number of steps
B. The learning rate is sufficiently small — convergence is guaranteed for any dataset given a small enough step size
C. The training data is normalized — standardized inputs ensure the gradient updates remain bounded throughout training
D. The weight initialization is close to the optimal solution — the perceptron converges faster from good starting weights
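For context, the rule the theorem concerns is the classic perceptron mistake-driven update: on a misclassified point (x, y) with y in {−1, +1}, set w ← w + y·x. Below is a minimal sketch of that update on a small toy dataset (the dataset and function name are illustrative, not from the question):

```python
# Minimal perceptron sketch. Bias is folded in as a constant 1.0 feature.
# On each misclassified point (x, y), apply the update w <- w + y * x.

def train_perceptron(data, max_epochs=100):
    w = [0.0] * len(data[0][0])
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in data:
            # Misclassified when y * <w, x> <= 0.
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
        if mistakes == 0:
            # A full pass with no mistakes: w separates the data.
            return w
    return w

# Toy dataset: label follows the sign of the first feature;
# the trailing 1.0 is the bias feature.
data = [((2.0, 1.0, 1.0), 1), ((1.0, -1.0, 1.0), 1),
        ((-2.0, 0.5, 1.0), -1), ((-1.0, -2.0, 1.0), -1)]
w = train_perceptron(data)
```

On data like this the loop halts after finitely many updates; whether that stopping guarantee extends to every dataset, or only under some condition, is exactly what the question asks.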