How do bias and variance errors get introduced?

Any supervised learning model is the result of jointly optimizing the error due to model complexity and the training error (the prediction error on the examples seen during training). Example: the Ridge Regression (regularized linear regression) cost function with parameters θ is given by J(θ) = MSE(θ) + λ·Σⱼ θⱼ², where MSE(θ) is the mean squared error of the predictions made by the model with parameters θ on the training set, and the λ·Σⱼ θⱼ² penalty term grows with model complexity.
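The ridge cost above can be sketched directly in numpy. This is a minimal illustration, not a production implementation; the names `ridge_cost`, `theta`, `lam`, and the tiny dataset are hypothetical choices for the example.

```python
import numpy as np

def ridge_cost(theta, X, y, lam):
    """Ridge regression cost: mean squared error plus an L2 complexity penalty.

    theta -- parameter vector, X -- design matrix, y -- targets,
    lam   -- regularization strength (lambda). All names are illustrative.
    """
    residuals = X @ theta - y
    mse = np.mean(residuals ** 2)          # training-error term
    penalty = lam * np.sum(theta ** 2)     # model-complexity term
    return mse + penalty

# Tiny example: theta fits the data exactly, so only the penalty contributes.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.0])
theta = np.array([0.0, 1.0])  # residuals are all zero for this theta
print(ridge_cost(theta, X, y, lam=0.0))   # 0.0: no training error, no penalty
print(ridge_cost(theta, X, y, lam=0.5))   # 0.5: penalty term only
```

Increasing λ pulls the optimal θ toward zero, trading a little training error for a simpler (lower-variance) model.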

What is the bias-variance trade-off?

The bias-variance tradeoff is the tradeoff between training error and test error: driving the training error as low as possible may lead to high generalization error on the test set. It is a core concept in supervised learning. We want to design models that fit the training data well, capturing its genuine structure without also fitting the noise.
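The tradeoff can be seen numerically by fitting polynomials of increasing degree to noisy quadratic data: training error keeps falling as the degree grows, while test error eventually rises. A small sketch under assumed settings (the `make_data` and `mse` helpers, the noise level, and the degrees tried are all arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a quadratic; a held-out test set exposes overfitting.
def make_data(n):
    x = rng.uniform(-1, 1, n)
    y = x ** 2 + rng.normal(0, 0.1, n)
    return x, y

x_train, y_train = make_data(20)
x_test, y_test = make_data(200)

def mse(deg):
    """Fit a degree-`deg` polynomial and return (train MSE, test MSE)."""
    coefs = np.polyfit(x_train, y_train, deg)
    train = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    return train, test

for deg in (1, 2, 10):
    tr, te = mse(deg)
    print(f"degree {deg:2d}: train MSE {tr:.4f}  test MSE {te:.4f}")
```

Degree 1 underfits (high bias), degree 2 matches the true function, and degree 10 drives the training error lower still while the test error grows (high variance).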

Can you give an example of a classifier with high bias and high variance?

High bias means the data is being underfit: the decision boundary is usually not complex enough. High variance results from overfitting: the decision boundary is more complex than it needs to be. High bias and high variance together occur when you fit a complex decision boundary that nevertheless fails to fit the training set well.
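The two failure modes can be contrasted on a toy problem: a majority-class classifier (high bias, ignores the input entirely) versus a 1-nearest-neighbour classifier (high variance, memorizes every noisy training label). This is a sketch under assumed settings; the `make_data` helper, the 20% label-flip noise, and the sample sizes are illustrative choices, not from the original text.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D toy classification: true label is 1 when x > 0, with 20% of labels flipped.
def make_data(n, flip=0.2):
    x = rng.uniform(-1, 1, n)
    y = (x > 0).astype(int)
    flipped = rng.random(n) < flip
    return x, np.where(flipped, 1 - y, y)

x_tr, y_tr = make_data(50)
x_te, y_te = make_data(500)

# High-bias model: always predict the majority training class (ignores x).
majority = int(np.mean(y_tr) >= 0.5)
bias_acc = np.mean(y_te == majority)

# High-variance model: 1-nearest-neighbour, which memorizes every noisy label.
def one_nn(x_query):
    idx = np.abs(x_tr[None, :] - x_query[:, None]).argmin(axis=1)
    return y_tr[idx]

train_acc = np.mean(one_nn(x_tr) == y_tr)  # each training point is its own neighbour
test_acc = np.mean(one_nn(x_te) == y_te)

print(f"majority-class test accuracy: {bias_acc:.2f}")
print(f"1-NN train accuracy: {train_acc:.2f}, test accuracy: {test_acc:.2f}")
```

The 1-NN model is perfect on the training set but drops on the test set because it memorized the flipped labels; the majority-class model is equally poor everywhere. A model with both high bias and high variance would combine the worst of the two: a wiggly boundary that still misses the training data.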