What is “model overfitting” in machine learning?

a) When a model performs well on training data but poorly on unseen test data
b) When a model performs equally well on training and test data
c) When a model fails to learn the data patterns
d) When a model underfits the data

Answer:

a) When a model performs well on training data but poorly on unseen test data

Explanation:

Overfitting occurs when a machine learning model performs well on the training data but fails to generalize to new, unseen test data. This happens because the model becomes too complex and starts learning noise and outliers in the training data, rather than general patterns.

Overfitting can be prevented by using regularization techniques such as L1/L2 regularization, pruning, or dropout, and by employing cross-validation to tune the model. Another way to prevent overfitting is to use a simpler model with fewer parameters.

Good models balance complexity and performance by learning the underlying patterns in the data without memorizing the noise, leading to better generalization on new data.
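The train/test gap described above can be demonstrated with a minimal sketch: fitting polynomials of two different degrees to the same noisy data. The data, function, and degree choices here are illustrative assumptions, not from the original text; a degree-14 polynomial through 15 points can interpolate the training data exactly (memorizing the noise), while a degree-3 polynomial captures only the underlying trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of an assumed underlying function, sin(2*pi*x)
x_train = np.linspace(0, 1, 15)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.3, x_test.size)

def train_test_mse(degree):
    """Fit a polynomial of the given degree and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = train_test_mse(3)    # simpler model
complex_train, complex_test = train_test_mse(14) # overfit model

# The degree-14 fit drives training error toward zero by tracking noise,
# but its test error exceeds that of the simpler degree-3 model.
print(f"degree  3: train={simple_train:.4f}  test={simple_test:.4f}")
print(f"degree 14: train={complex_train:.4f}  test={complex_test:.4f}")
```

Here the overly complex model wins on the data it has seen and loses on data it has not, which is exactly the pattern in option (a).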

