How do bias and variance errors get introduced?

Any supervised learning model is the result of balancing two sources of error: the training error (the prediction error on the examples seen during training) and the error due to model complexity. Example: the cost function of Ridge Regression, or regularized linear regression, with parameters $\theta$ is given by

$$J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)^2 + \lambda\sum_{j=1}^{n}\theta_j^2,$$

where the first term is the mean squared error of the predictions made by the model with parameters $\theta$ on the training examples, and the second term penalizes model complexity.
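As a minimal sketch of how these two error terms combine (the function name, the NumPy implementation, and the exact scaling of the penalty are illustrative assumptions, not code from the original), the ridge cost could be computed like this:

```python
import numpy as np

def ridge_cost(theta, X, y, lam):
    """Regularized linear regression (ridge) cost:
    mean squared training error plus an L2 penalty on the weights."""
    predictions = X @ theta                      # predictions of the model with parameters theta
    mse = np.mean((predictions - y) ** 2)        # training (prediction) error
    complexity = lam * np.sum(theta ** 2)        # error due to model complexity
    return mse + complexity
```

Increasing `lam` pushes the optimizer toward simpler models (smaller weights) at the expense of a slightly higher training error, which is exactly the balance described above.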

What is the bias-variance trade-off?

The bias-variance tradeoff is the tension between training error and test error: driving the training error as low as possible may lead to a high generalization error on the test set. It is a core concept in supervised learning. We want to design models that fit the training data well, capturing its subtleties, while still generalizing to unseen examples.
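A quick way to see the tradeoff is to fit models of increasing complexity and compare training and test error. The sketch below, assuming scikit-learn is available, uses a noisy sine curve and polynomial degrees 1, 4, and 15 as illustrative choices (none of these specifics come from the original):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic data: a noisy sine curve stands in for the "subtleties" of real data.
rng = np.random.RandomState(0)
X = rng.uniform(0, 1, size=(60, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=60)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

for degree in (1, 4, 15):
    # Higher polynomial degree = more complex model.
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```

Typically the degree-1 model underfits (high bias: both errors are high), the degree-15 model overfits (high variance: training error near zero, test error large), and an intermediate degree strikes the best balance between the two.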