Bias and Variance Tradeoff
rameshjesswani edited this page Jul 13, 2018
- Bias: the difference between the model's expected prediction and the true value (Bias = Expected value - True value)
- Variance: the spread of the model's predictions around their mean, i.e., how much the predictions change across different training sets
- In the figure, when the model's predictions are accurate, the error is very low, which means both bias and variance are low: all the points land on the bull's eye.
- Similarly, as the variance increases, the spread of the predictions increases, making them less accurate; and as the bias increases, the gap between the predicted values and the observed values grows.
- Total Error = Bias^2 + Variance + Irreducible Error, where the irreducible error is the noise inherent in the data itself.
- As the model complexity (number of parameters) increases, the variance of the model increases and the bias decreases.
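The decomposition above can be checked numerically. Below is a minimal Python sketch (all names are illustrative, not from this page): it estimates a known true value with a deliberately shrunk sample mean, so the estimator has both bias and variance, and verifies that MSE = Bias^2 + Variance holds for the sample moments (the noise term drops out here because we compare against the true value directly).

```python
import random

random.seed(0)

TRUE_VALUE = 3.0   # the quantity we try to estimate
NOISE_SD = 1.0     # observation noise
N_OBS = 10         # observations per experiment
N_TRIALS = 20000   # repeated experiments to approximate expectations

def shrunk_estimator(observations, shrinkage=0.8):
    """Sample mean scaled by a shrinkage factor: biased, but lower variance."""
    return shrinkage * sum(observations) / len(observations)

estimates = []
for _ in range(N_TRIALS):
    obs = [random.gauss(TRUE_VALUE, NOISE_SD) for _ in range(N_OBS)]
    estimates.append(shrunk_estimator(obs))

mean_est = sum(estimates) / len(estimates)
bias = mean_est - TRUE_VALUE  # Bias = Expected value - True value
variance = sum((e - mean_est) ** 2 for e in estimates) / len(estimates)
mse = sum((e - TRUE_VALUE) ** 2 for e in estimates) / len(estimates)

print(f"bias^2 + variance = {bias ** 2 + variance:.4f}, MSE = {mse:.4f}")
```

With a shrinkage factor of 0.8 the estimator's expected value is 0.8 * 3.0 = 2.4, so the bias is roughly -0.6 by construction, while the shrinkage simultaneously reduces the variance of the raw sample mean.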
General Comments
- A model with small variance usually has a large bias (it is underfitting)
- A model with small bias usually has a large variance (it is overfitting)
- Too many parameters (too much capacity) leads to overfitting, while too few parameters (too little capacity) leads to underfitting. It is therefore not trivial to determine when under- or overfitting happens.
- The number of parameters (capacity) needs to be carefully adjusted to match the task's complexity and the number of training samples.
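The contrast in the comments above can be made concrete with a small simulation. The sketch below (model names are hypothetical, not from this page) draws data from y = 2x + noise and compares two extremes of capacity: a constant model that predicts the training mean (too little capacity, ignores x) and a model that memorizes the label of the nearest training point (too much capacity). At a fixed test point, the constant model shows large bias with small variance, and the memorizer the reverse.

```python
import random

random.seed(1)

NOISE_SD = 0.5
X_TEST = 0.9            # fixed query point
TRUE_Y = 2 * X_TEST     # noiseless target at the query point

def make_training_set(n=20):
    """Sample a fresh training set from y = 2x + Gaussian noise, x in [0, 1]."""
    xs = [random.uniform(0, 1) for _ in range(n)]
    ys = [2 * x + random.gauss(0, NOISE_SD) for x in xs]
    return xs, ys

def constant_model(xs, ys, x):
    """Low capacity: always predict the training mean (ignores x entirely)."""
    return sum(ys) / len(ys)

def nearest_model(xs, ys, x):
    """High capacity: predict the label of the single nearest training point."""
    i = min(range(len(xs)), key=lambda j: abs(xs[j] - x))
    return ys[i]

def bias_and_variance(model, n_trials=5000):
    """Estimate bias and variance of a model's prediction at X_TEST."""
    preds = []
    for _ in range(n_trials):
        xs, ys = make_training_set()
        preds.append(model(xs, ys, X_TEST))
    mean_pred = sum(preds) / len(preds)
    var = sum((p - mean_pred) ** 2 for p in preds) / len(preds)
    return mean_pred - TRUE_Y, var

bias_c, var_c = bias_and_variance(constant_model)
bias_n, var_n = bias_and_variance(nearest_model)
print(f"constant model: bias={bias_c:+.3f}, variance={var_c:.3f}")
print(f"nearest model:  bias={bias_n:+.3f}, variance={var_n:.3f}")
```

The constant model's prediction hovers around 1.0 (the mean of 2x over [0, 1]) regardless of X_TEST, giving a bias near -0.8 but little spread across training sets; the memorizer tracks the true value at X_TEST closely on average but inherits the full observation noise, giving a much larger variance.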