Tuesday, 4 February 2020

L1 and L2: Losses, Norms, Regularisations, and Deviations

1. Losses

With Ui the model output and Yi the true value:
L1 Loss:
Loss = (|U1 - Y1| + ... + |UN - YN|) / N
L2 Loss:
Loss = ((U1 - Y1)^2 + ... + (UN - YN)^2) / N
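A minimal NumPy sketch of the two losses above (the function names are my own, not from any particular library):

```python
import numpy as np

def l1_loss(u, y):
    # Mean absolute error: average of |Ui - Yi|
    return np.mean(np.abs(u - y))

def l2_loss(u, y):
    # Mean squared error: average of (Ui - Yi)^2
    return np.mean((u - y) ** 2)

u = np.array([1.0, 2.0, 3.0])  # outputs
y = np.array([1.5, 2.0, 2.0])  # true values
print(l1_loss(u, y))  # 0.5
print(l2_loss(u, y))  # 0.4166...
```

Note how the L2 loss punishes the single large error (1.0) much more heavily than the two small ones; that is the practical difference between the two.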

2. Norms

With Wi the weights:
L1 Norm:
Norm = |W1| + ... + |WN|
L2 Norm:
Norm = W1^2 + ... + WN^2
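The same two norms in NumPy (here the L2 norm is the sum of squares as defined above, without a square root, which is the form usually used for regularisation):

```python
import numpy as np

def l1_norm(w):
    # Sum of absolute weights
    return np.sum(np.abs(w))

def l2_norm(w):
    # Sum of squared weights (no square root, matching the definition above)
    return np.sum(w ** 2)

w = np.array([0.5, -1.0, 2.0])
print(l1_norm(w))  # 3.5
print(l2_norm(w))  # 5.25
```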

3. Regularisations

L1 Regularisation:
Regularisation = Ratio × L1 Norm
L2 Regularisation:
Regularisation = Ratio × L2 Norm
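In training, the regularisation term is simply added to the loss. A sketch of that combination, assuming an L2 base loss and a small ratio (the 0.01 default is only illustrative):

```python
import numpy as np

def regularised_loss(u, y, w, ratio=0.01, kind="l2"):
    # Base L2 loss on the outputs plus Ratio x Norm penalty on the weights
    base = np.mean((u - y) ** 2)
    if kind == "l1":
        penalty = ratio * np.sum(np.abs(w))   # Ratio x L1 Norm
    else:
        penalty = ratio * np.sum(w ** 2)      # Ratio x L2 Norm
    return base + penalty

u = np.array([1.0, 2.0])
y = np.array([1.0, 2.0])   # perfect fit: base loss is 0
w = np.array([1.0, -2.0])
print(regularised_loss(u, y, w, ratio=0.1, kind="l1"))  # 0.3
print(regularised_loss(u, y, w, ratio=0.1, kind="l2"))  # 0.5
```

Even with a perfect fit, the penalty keeps pushing the weights toward zero; L1 tends to drive some weights exactly to zero, while L2 shrinks them all smoothly.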

4. Bonus: Deviations

L1 Deviation:
L1 Variance = (|X1 - Mean| + ... + |XN - Mean|) / N
Mean Deviation = L1 Variance
L2 Deviation:
L2 Variance = ((X1 - Mean)^2 + ... + (XN - Mean)^2) / N
Standard Deviation = sqrt(L2 Variance)
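The two deviations in NumPy; note that only the L2 case takes a square root, so the standard deviation here matches `np.std`:

```python
import numpy as np

def mean_deviation(x):
    # L1 variance: average absolute distance from the mean
    m = np.mean(x)
    return np.mean(np.abs(x - m))

def standard_deviation(x):
    # Square root of the L2 variance (population standard deviation)
    m = np.mean(x)
    return np.sqrt(np.mean((x - m) ** 2))

x = np.array([1.0, 2.0, 3.0, 4.0])
print(mean_deviation(x))      # 1.0
print(standard_deviation(x))  # 1.118...
```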
Deviations are important in practice, for example in the generator of a GAN. After training an auto-encoder, the decoder can be used as a generator; adding a deviation layer to the auto-encoder model makes it work better.
