How do bias and variance errors get introduced?

Any supervised learning model is the result of jointly optimizing the error due to model complexity and the training error (the prediction error on examples seen during training). Example: the Ridge Regression, or regularized linear regression, cost function with parameters $\theta$ is given by

$$J(\theta) = \mathrm{MSE}(\theta) + \lambda \lVert \theta \rVert_2^2$$

where $\mathrm{MSE}(\theta)$ is the mean squared error of the predictions made (by the model with parameters $\theta$) on training…
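To make the two error terms concrete, here is a minimal NumPy sketch of the ridge cost, split into the training-error (MSE) term and the complexity penalty; `ridge_cost`, `lam`, and the toy data are hypothetical names introduced for illustration:

```python
import numpy as np

def ridge_cost(theta, X, y, lam):
    """Ridge cost = training MSE + L2 complexity penalty (hypothetical helper)."""
    residuals = X @ theta - y
    mse = np.mean(residuals ** 2)        # training (prediction) error term
    penalty = lam * np.sum(theta ** 2)   # model-complexity term, weighted by lam
    return mse + penalty

# Toy usage: a larger lam penalizes complex (large-weight) models more heavily.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
theta = np.array([1.0, -2.0, 0.5])
y = X @ theta + 0.1 * rng.standard_normal(100)
print(ridge_cost(theta, X, y, lam=0.1))
```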

How do you eliminate underfitting?

(a) Make the model simpler
(b) Collect more data
(c) Collect more features
(d) Increase the regularization parameter

Answer – (c). Underfitting is the opposite of overfitting, and it occurs when the model is too simple to learn from the given dataset. This could happen if the right features were not selected or extracted, or the regularization was done with higher…
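Since the answer is to collect more features, the minimal scikit-learn sketch below shows an underfitting linear model improving once polynomial features add capacity; the synthetic data and the chosen degree are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# A plain linear model is too simple for a sinusoid and underfits.
linear = LinearRegression().fit(X, y)
print("linear R^2:", round(linear.score(X, y), 3))

# Collecting more features (here, polynomial expansions) adds capacity.
X_poly = PolynomialFeatures(degree=5).fit_transform(X)
poly = LinearRegression().fit(X_poly, y)
print("degree-5 R^2:", round(poly.score(X_poly, y), 3))
```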

What is perplexity? Where do you typically use perplexity?

Perplexity is a measure used in probabilistic modeling. In NLP it is used to measure how well a probabilistic model explains the observed data. It is closely related to the likelihood, which is the value of the joint probability of the observed data. Suppose the model generates data $x_1, \dots, x_N$; then the perplexity can be computed as:

$$\mathrm{PP}(x_1, \dots, x_N) = p(x_1, \dots, x_N)^{-1/N}$$

…
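As a quick illustration of the definition above, here is a minimal sketch that computes perplexity from per-token log-probabilities; `perplexity` and its input are hypothetical names, and the sketch assumes the joint probability factorizes into per-token terms:

```python
import math

def perplexity(token_log_probs):
    """exp of the negative average log-probability, i.e. p(x_1..x_N)^(-1/N)."""
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

# A model that assigns every token probability 1/4 has perplexity 4,
# matching the intuition of "4 equally likely choices per step".
print(perplexity([math.log(0.25)] * 10))  # -> 4.0 (up to float rounding)
```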

Suppose you are modeling text with an HMM. What is the complexity of finding the most probable sequence of tags or states for a sequence of text using a brute-force algorithm?

Assume there are $N$ total states and let $T$ be the length of the longest sequence. Think about how we generate text using an HMM: we first have a state sequence, and from each state we emit an output. From each state, any word out of the $M$ possible outcomes can be generated. Since there are $N$ states, at each possible…
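A minimal brute-force decoder sketch makes the counting argument concrete: it enumerates all $N^T$ state sequences and scores each one in $O(T)$ time, for $O(T \cdot N^T)$ total work. The dict-based parameters and the tiny two-state example are hypothetical:

```python
import itertools

def brute_force_decode(words, states, start_p, trans_p, emit_p):
    """Try every state sequence (N^T of them), scoring each in O(T) time."""
    best_seq, best_p = None, 0.0
    T = len(words)
    for seq in itertools.product(states, repeat=T):  # N^T candidate sequences
        p = start_p[seq[0]] * emit_p[seq[0]][words[0]]
        for t in range(1, T):                        # O(T) work per candidate
            p *= trans_p[seq[t - 1]][seq[t]] * emit_p[seq[t]][words[t]]
        if p > best_p:
            best_seq, best_p = seq, p
    return best_seq, best_p

# Tiny two-state tagging example (made-up probabilities):
states = ["N", "V"]
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit_p = {"N": {"fish": 0.6, "swim": 0.4}, "V": {"fish": 0.3, "swim": 0.7}}
print(brute_force_decode(["fish", "swim"], states, start_p, trans_p, emit_p))
```

For contrast, the Viterbi algorithm brings this down to $O(T \cdot N^2)$ with dynamic programming.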

How do you measure the performance of a language model?

While building a language model, we try to estimate the probability of a sentence or a document. Given sequences (sentences or documents) like

$$w_1, w_2, \dots, w_n,$$

the language model (here a bigram language model) will be:

$$p(w_1, w_2, \dots, w_n) = \prod_{i=1}^{n} p(w_i \mid w_{i-1})$$

for each sequence given by the above equation. Once we apply Maximum Likelihood Estimation (MLE), we should have a value for the term $p(w_i \mid w_{i-1})$. Perplexity…
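To tie the pieces together, here is a minimal sketch of MLE bigram estimation and sentence perplexity; `train_bigram_mle`, the `<s>`/`</s>` boundary tokens, and the toy corpus are hypothetical, and no smoothing is applied (an unseen bigram would raise a KeyError):

```python
from collections import Counter
import math

def train_bigram_mle(sentences):
    """MLE estimate: p(w_i | w_{i-1}) = count(w_{i-1}, w_i) / count(w_{i-1})."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens[:-1])
        bigrams.update(zip(tokens[:-1], tokens[1:]))
    return {bg: c / unigrams[bg[0]] for bg, c in bigrams.items()}

def sentence_perplexity(sent, probs):
    """Perplexity of one sentence under the (unsmoothed) bigram model."""
    tokens = ["<s>"] + sent + ["</s>"]
    log_p = sum(math.log(probs[bg]) for bg in zip(tokens[:-1], tokens[1:]))
    return math.exp(-log_p / (len(tokens) - 1))

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
probs = train_bigram_mle(corpus)
print(sentence_perplexity(["the", "cat", "sat"], probs))  # ~1.19
```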