L2 regularization is also known as ridge regression. Like other regularization techniques, it is used to prevent overfitting. Under L2 regularization, the weights of less important features are shrunk toward zero but do not become exactly zero.

When we look at model complexity as a function of the weights, a feature's contribution to the model's complexity grows with the absolute value of its weight: the larger the weight, the more that feature can bend the model's predictions.

Z = w_{1}q_{1} + w_{2}q_{2} + w_{3}q_{3} + \dots + w_{n}q_{n}

The L2 regularization term is the sum of the squared weights:

(w_{1})^{2} + (w_{2})^{2} + \dots + (w_{n})^{2}
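The penalty above can be computed directly. A minimal sketch, with weight values and the strength `lam` chosen purely for illustration:

```python
# L2 penalty: sum of squared weights (illustrative values)
weights = [0.5, -1.2, 0.3]
l2_penalty = sum(w ** 2 for w in weights)

# The penalty is added to the data loss, scaled by a chosen strength lambda
lam = 0.01
print(l2_penalty)  # 0.25 + 1.44 + 0.09 ≈ 1.78
```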

L2 regularization pushes weights toward zero, but it does not make them exactly zero. At each iteration, it acts like a force that removes a small percentage of each weight's value. Because the update is multiplicative, the weights keep shrinking but never reach zero.
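This shrink-by-a-percentage behaviour can be seen in a toy gradient-descent loop. A sketch, assuming a hypothetical learning rate `lr` and regularization strength `lam`:

```python
# The gradient of the penalty (lam/2) * w^2 is lam * w, so each update
# multiplies the weight by (1 - lr * lam): it decays toward zero
# but never reaches exactly zero.
w = 1.0
lr, lam = 0.1, 0.5  # hypothetical learning rate and L2 strength
for _ in range(100):
    w -= lr * lam * w

print(w)  # tiny, but still greater than zero
```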

L2 is one of the standard regularization techniques used to reduce a model's overfitting.

Neural networks are deployed to draw patterns from huge amounts of data. A neural network has an input layer, an output layer, weights, and bias values, and it can have multiple hidden layers. The more layers a network has, the more complex it becomes; such a network is called a deep neural network.

A neural network's weight parameters transform the input data as it passes through the network's hidden layers.

Each time the network's weights are updated, L2 regularization shrinks them slightly as part of the update. In this way, L2 is used to reduce overfitting in neural networks.
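The same idea applies when training any model by gradient descent. A minimal sketch with a one-weight linear model trained on squared error plus an L2 penalty; the data and hyperparameters here are illustrative, not from the original text:

```python
# One-weight linear model trained with squared error plus an L2 penalty
# (weight decay). Data follows y = 2x; the penalty pulls the learned
# weight slightly below the unregularized solution.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0
lr, lam = 0.01, 0.1  # hypothetical learning rate and L2 strength

for _ in range(500):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x + 2 * lam * w  # data gradient + L2 gradient
        w -= lr * grad

print(w)  # close to 2, but pulled slightly below it by the penalty
```

Note how the L2 term contributes `2 * lam * w` to every gradient step: the larger the weight, the harder it is pulled back, which is exactly the shrinking force described above.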