L2 regularization is also known as ridge regression. Regularization is done to prevent overfitting. With L2 regularization, the weights of less important features become small but are never driven exactly to zero.
When we treat model complexity as a function of the weights, a feature with a larger absolute weight contributes more to that complexity.
$$Z = w_1 q_1 + w_2 q_2 + w_3 q_3 + \dots + w_n q_n$$

The L2 regularization term is:

$$w_1^2 + w_2^2 + \dots + w_n^2$$
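As an illustrative sketch, this penalty can be added to an ordinary squared-error loss like so (the names `ridge_loss` and `lam` are my own; `lam` controls the penalty's strength):

```python
import numpy as np

def ridge_loss(w, X, y, lam):
    """Mean squared error plus the L2 penalty: lam * (w_1^2 + ... + w_n^2)."""
    residuals = X @ w - y                # prediction errors
    mse = np.mean(residuals ** 2)        # data-fit term
    l2_penalty = lam * np.sum(w ** 2)    # sum of squared weights
    return mse + l2_penalty
```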
L2 regularization pushes weights toward zero, but it never makes them exactly zero: at each iteration it acts like a force that removes a small percentage of each weight, so the weights shrink steadily without ever reaching zero. This shrinking effect is what makes L2 one of the standard techniques for reducing overfitting.
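A minimal sketch of that per-iteration shrinkage (the learning rate `lr` and penalty strength `lam` are made-up values for illustration):

```python
import numpy as np

w = np.array([4.0, -2.0, 0.5])   # some starting weights
lr, lam = 0.1, 0.5               # illustrative learning rate and L2 strength

for step in range(5):
    w = w - lr * (2 * lam * w)   # gradient of lam * sum(w**2) is 2*lam*w
    print(step, w)

# Each step multiplies w by (1 - 2*lr*lam) = 0.9, so every weight
# loses 10% of its value per iteration but never becomes exactly zero.
```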
Neural networks are used to learn patterns from large amounts of data. A neural network has an input layer and an output layer, weights and bias values, and can have multiple hidden layers. The more layers a network has, the more complex it becomes; such a network is called a deep neural network.
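For concreteness, here is a minimal forward pass through one hidden layer (the layer sizes and the ReLU activation are illustrative choices, not anything prescribed above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)                          # input layer: 3 features

W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # hidden layer: 4 units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # output layer: 1 unit

h = np.maximum(0.0, W1 @ x + b1)  # hidden activations (ReLU)
y = W2 @ h + b2                   # network output
print(y)
```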
A neural network uses its weight parameters to transform the input data as it passes through the network's hidden layers.
Each time the network's weights are updated, the L2 term adds a small shrinking component to the update, pulling every weight toward zero. This is how L2 is used to reduce overfitting in neural networks.
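A minimal sketch of such an update loop, using plain gradient descent on a toy linear model (the data, `lr`, and `lam` are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                       # toy inputs
y = X @ np.array([3.0, 0.0, -1.0]) + 0.1 * rng.normal(size=100)

w = np.zeros(3)
lr, lam = 0.05, 0.1                                 # illustrative hyperparameters

for _ in range(200):
    grad_data = 2 * X.T @ (X @ w - y) / len(y)      # gradient of the MSE term
    grad_l2 = 2 * lam * w                           # gradient of the L2 penalty
    w -= lr * (grad_data + grad_l2)                 # every update also shrinks w

print(w)   # small weights, but none exactly zero
```

In deep learning libraries this same shrinkage is often exposed directly, for example as the `weight_decay` argument of `torch.optim.SGD` in PyTorch.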