What is “feature scaling” in machine learning?

a) A technique to standardize the range of independent variables in the dataset
b) A method to extract important features from the data
c) A process to increase the size of the dataset
d) A method to reduce the dimensionality of the dataset

Answer:

a) A technique to standardize the range of independent variables in the dataset

Explanation:

Feature scaling is a preprocessing technique used in machine learning to standardize the range of independent variables or features in a dataset. This is important because features with different scales can negatively impact the performance of many machine learning algorithms, especially those based on distance measures like K-nearest neighbors (KNN) or support vector machines (SVMs).
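A minimal sketch of why unscaled features distort distance-based methods (the values below are made up for illustration): when one feature spans thousands of units and another spans tens, the Euclidean distance is dominated almost entirely by the larger-scale feature.

```python
import numpy as np

# Two illustrative samples with features on very different scales:
# (income in dollars, age in years)
a = np.array([50_000.0, 25.0])
b = np.array([51_000.0, 60.0])

# Euclidean distance, as used by KNN: sqrt(1000^2 + 35^2)
dist = np.linalg.norm(a - b)
print(dist)  # ~1000.61 — the 35-year age difference barely registers
```

Without scaling, the income feature effectively decides every neighbor lookup, regardless of how informative age is.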

Two common methods for feature scaling are normalization and standardization. Normalization scales features to a range of [0, 1], while standardization rescales features to have a mean of 0 and a standard deviation of 1.
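Both methods can be sketched in a few lines of NumPy (the sample values are arbitrary, chosen only to make the arithmetic easy to follow):

```python
import numpy as np

# Toy feature column (illustrative values)
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Normalization (min-max scaling): maps values into [0, 1]
x_norm = (x - x.min()) / (x.max() - x.min())
print(x_norm)  # [0.   0.25 0.5  0.75 1.  ]

# Standardization (z-score scaling): mean 0, standard deviation 1
x_std = (x - x.mean()) / x.std()
print(x_std)   # roughly [-1.414, -0.707, 0., 0.707, 1.414]
```

In practice, libraries such as scikit-learn provide `MinMaxScaler` and `StandardScaler` for the same two transformations, which also remember the training-set statistics so the identical scaling can be applied to test data.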

Feature scaling ensures that all features contribute equally to the model’s predictions, leading to faster convergence during training and better model performance.
