If a machine learning model does not generalise well, it contains some kind of error.

Error = difference between the actual and predicted values/classes

Formula: error = sum of (actual output − predicted output). Also, **total error is the sum of reducible and irreducible error.**
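As a sketch of that formula (with made-up actual and predicted values): the signed differences can cancel each other out, which is why in practice the squared differences are usually averaged into the mean squared error (MSE).

```python
import numpy as np

# Hypothetical actual and predicted values, purely for illustration.
actual = np.array([3.0, 5.0, 7.0, 9.0])
predicted = np.array([2.5, 5.5, 6.0, 9.5])

# Raw per-sample error: actual output - predicted output.
errors = actual - predicted

# Signed errors cancel, so the mean squared error is reported instead.
mse = np.mean(errors ** 2)
print(mse)  # → 0.4375
```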

*Reducible error = bias² + variance (for squared-error loss, bias enters squared)*
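For squared-error loss, this decomposition is a standard result and can be written out in full (with $f$ the true function, $\hat{f}$ the learned model, and $\sigma^2$ the noise variance):

```latex
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}\Big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\Big]}_{\text{Variance}}
  + \underbrace{\sigma^2}_{\text{Irreducible error}}
```

The first two terms are the reducible error; the last term comes from noise in the data and cannot be removed by any model.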

**Bias** is how far the predicted values/classes are from the actual values/classes. If the predicted values are too far from the actual values, the model is highly biased.

*If the predicted values are not too far from the actual ones, the model has low bias.*

If the model is highly biased, it won’t be able to capture the complexity of the data and hence it UNDERFITS. (Underfitting)
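A minimal sketch of underfitting, using made-up quadratic data: a straight line (degree-1 polynomial) is too simple to capture the curvature, so even its training error stays high compared to a model with the right complexity.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = x ** 2 + rng.normal(0, 0.5, size=x.shape)  # noisy quadratic ground truth

# A straight line is highly biased for quadratic data: it underfits.
linear = np.polyval(np.polyfit(x, y, deg=1), x)
# A quadratic fit matches the true complexity of the data.
quadratic = np.polyval(np.polyfit(x, y, deg=2), x)

linear_mse = np.mean((y - linear) ** 2)
quadratic_mse = np.mean((y - quadratic) ** 2)
print(linear_mse > quadratic_mse)  # → True: the biased model has larger training error
```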

If the model performs well on the training dataset but does not perform well on testing or validation data that is new to it, the problem is termed variance. So variance is how scattered the predicted values are around the actual values across different training sets. If the model has high variance, then the model overfits (OVERFITTING).

This is often described as the model having learned the noise.
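Overfitting can be sketched with the same kind of made-up quadratic data: a very flexible polynomial chases the noise in a small training set, so its training error is tiny while its error on fresh test data is much worse than that of a simpler model.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    x = rng.uniform(-3, 3, n)
    return x, x ** 2 + rng.normal(0, 1.0, n)  # noisy quadratic

x_train, y_train = make_data(12)
x_test, y_test = make_data(200)

# Degree-9 polynomial on 12 points: flexible enough to learn the noise.
overfit = np.polyfit(x_train, y_train, deg=9)
# Degree-2 polynomial: matches the true signal's complexity.
good = np.polyfit(x_train, y_train, deg=2)

def mse(coeffs, x, y):
    return np.mean((y - np.polyval(coeffs, x)) ** 2)

# The overfit model wins on training data but loses badly on test data.
print(mse(overfit, x_train, y_train), mse(overfit, x_test, y_test))
print(mse(good, x_train, y_train), mse(good, x_test, y_test))
```

The gap between training and test error is exactly the high-variance symptom described above: performance that does not transfer to data the model has not seen.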
