L2 norm regularization : Penalize the squared magnitude of the weights, pushing them closer to zero to prevent overfitting. The most common form of regularization (also known as weight decay)
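A minimal NumPy sketch of the idea above (function names are illustrative, not from the original notes): the L2 penalty adds half the sum of squared weights to the loss, and its gradient term shrinks every weight toward zero on each update.

```python
import numpy as np

def l2_regularized_loss(weights, data_loss, lam=0.01):
    # Total loss = data loss + (lam / 2) * sum of squared weights
    return data_loss + 0.5 * lam * np.sum(weights ** 2)

def l2_gradient_term(weights, lam=0.01):
    # Gradient of the penalty is lam * w: each update step
    # pulls every weight proportionally toward zero
    return lam * weights
```

During training, `l2_gradient_term` would be added to the usual data-loss gradient before the weight update.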

L1 norm regularization : Push the weights toward zero and also induce sparsity (many weights become exactly zero). A less common form of regularization
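To see why L1 produces exact zeros while L2 only shrinks, here is a sketch (assumed helper names) of the L1 penalty and the soft-thresholding step that arises when optimizing it: weights smaller than the threshold are set exactly to zero.

```python
import numpy as np

def l1_penalty(weights, lam=0.01):
    # Penalty is lam * sum of absolute weight values
    return lam * np.sum(np.abs(weights))

def soft_threshold(weights, lam=0.01):
    # Proximal step for L1: shrink each weight by lam and clip at zero,
    # so small weights become exactly zero -- this is the sparsity effect
    return np.sign(weights) * np.maximum(np.abs(weights) - lam, 0.0)
```

This sparsity is why L1 is sometimes used for feature selection, even though it is less common than L2 for deep networks.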

Dropout regularization : Drop some of the hidden units at random during training so the network cannot become too reliant on any single neuron, which reduces overfitting
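A minimal sketch of the common "inverted dropout" variant: each unit is kept with probability `keep_prob`, and survivors are scaled up so the expected activation is unchanged at test time (function name and defaults are illustrative).

```python
import numpy as np

def dropout(activations, keep_prob=0.8, rng=None):
    # Zero out each unit with probability 1 - keep_prob, then scale
    # the survivors by 1 / keep_prob so the expected value is preserved
    rng = rng or np.random.default_rng(0)
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob
```

At inference time dropout is disabled and activations are used as-is; the scaling during training makes that possible without a separate correction.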

Early stopping : Stop training before the weights overfit to the training data, typically when the validation error stops improving
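The monitoring loop behind early stopping can be sketched as follows (a hypothetical helper, assuming a per-epoch validation loss is available): training halts once the validation loss has not improved for `patience` consecutive epochs.

```python
def early_stopping(val_losses, patience=3):
    # Return the epoch at which training should stop: the first epoch
    # where validation loss has gone `patience` epochs without improving
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses) - 1
```

In practice the model weights from the best epoch are restored, not the final ones.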