1. Losses
With $U_i$ the model output and $Y_i$ the true value,
L1 Loss:
$$\mathrm{Loss} = \frac{|U_1 - Y_1| + \dots + |U_N - Y_N|}{N}$$
L2 Loss:
$$\mathrm{Loss} = \frac{(U_1 - Y_1)^2 + \dots + (U_N - Y_N)^2}{N}$$
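As a small illustration, here is a plain-Python sketch of both losses; `outputs` and `targets` are assumed to be equal-length lists of numbers, and the function names are illustrative:

```python
def l1_loss(outputs, targets):
    # Mean absolute error: average of |U_i - Y_i|
    return sum(abs(u - y) for u, y in zip(outputs, targets)) / len(outputs)

def l2_loss(outputs, targets):
    # Mean squared error: average of (U_i - Y_i)^2
    return sum((u - y) ** 2 for u, y in zip(outputs, targets)) / len(outputs)

print(l1_loss([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))  # 0.5
print(l2_loss([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))  # ~0.4167
```

Because of the squaring, the L2 loss punishes large errors more heavily, while the L1 loss treats all error sizes proportionally.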
2. Norms
With $W_i$ a weight of the model,
L1 Norm:
$$\mathrm{Norm} = |W_1| + \dots + |W_N|$$
L2 Norm:
$$\mathrm{Norm} = \sqrt{W_1^2 + \dots + W_N^2}$$
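The norms can be sketched the same way, assuming `weights` is a list of the model's weights:

```python
import math

def l1_norm(weights):
    # Sum of the absolute values of the weights
    return sum(abs(w) for w in weights)

def l2_norm(weights):
    # Square root of the sum of the squared weights
    return math.sqrt(sum(w ** 2 for w in weights))

print(l1_norm([3.0, -4.0]))  # 7.0
print(l2_norm([3.0, -4.0]))  # 5.0
```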
3. Regularisations
L1 Regularisation:
$$\mathrm{Regularisation} = \mathrm{Ratio} \times \mathrm{L1\ Norm}$$
L2 Regularisation:
$$\mathrm{Regularisation} = \mathrm{Ratio} \times \mathrm{L2\ Norm}$$
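In practice the regularisation term is added to the data loss, so the objective being minimised is Loss + Regularisation. A minimal sketch, assuming a small ratio of 0.01 (the value is only an example):

```python
import math

def l1_regularisation(weights, ratio=0.01):
    # Penalty: ratio times the L1 norm of the weights
    return ratio * sum(abs(w) for w in weights)

def l2_regularisation(weights, ratio=0.01):
    # Penalty: ratio times the L2 norm of the weights
    return ratio * math.sqrt(sum(w ** 2 for w in weights))

weights = [3.0, -4.0]
data_loss = 0.25  # e.g. an L2 loss computed as in section 1
total_loss = data_loss + l2_regularisation(weights)  # 0.25 + 0.05
```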
4. Bonus: Deviations
L1 Deviation:
$$\mathrm{L1\ Variance} = \frac{|X_1 - \mathrm{Mean}| + \dots + |X_N - \mathrm{Mean}|}{N}$$
$$\mathrm{Mean\ Deviation} = \mathrm{L1\ Variance}$$
L2 Deviation:
$$\mathrm{L2\ Variance} = \frac{(X_1 - \mathrm{Mean})^2 + \dots + (X_N - \mathrm{Mean})^2}{N}$$
$$\mathrm{Standard\ Deviation} = \sqrt{\mathrm{L2\ Variance}}$$
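A plain-Python sketch of both deviations, assuming `xs` is a list of sample values:

```python
import math

def mean_deviation(xs):
    # L1 variance: average absolute distance from the mean
    mean = sum(xs) / len(xs)
    return sum(abs(x - mean) for x in xs) / len(xs)

def standard_deviation(xs):
    # L2 variance: average squared distance from the mean, then its square root
    mean = sum(xs) / len(xs)
    l2_variance = sum((x - mean) ** 2 for x in xs) / len(xs)
    return math.sqrt(l2_variance)

print(mean_deviation([1.0, 2.0, 3.0]))      # ~0.667
print(standard_deviation([1.0, 2.0, 3.0]))  # ~0.816
```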
Deviations are important: they are used in the generator of GANs. After training an auto-encoder, the decoder can be used as a generator. Adding a deviation layer to the auto-encoder model makes it work better.
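As a rough, hypothetical illustration of that idea (not the author's exact model), the sketch below adds a deviation output next to the mean in the latent code of an auto-encoder, so that after training the decoder can be fed random latent vectors and used as a generator; all layer names and sizes are assumptions:

```python
import torch
import torch.nn as nn

class DeviationAutoEncoder(nn.Module):
    # Illustrative sketch only: an auto-encoder whose latent code carries a
    # mean and a standard deviation, in the spirit of the deviation layer
    # mentioned above.
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU())
        self.to_mean = nn.Linear(128, latent_dim)
        self.to_log_std = nn.Linear(128, latent_dim)  # the "deviation layer"
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, input_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mean = self.to_mean(h)
        std = torch.exp(self.to_log_std(h))      # keep the deviation positive
        z = mean + std * torch.randn_like(std)   # sample around the mean
        return self.decoder(z)

    def generate(self, n_samples=16):
        # After training, sample random latent vectors and decode them.
        z = torch.randn(n_samples, self.to_mean.out_features)
        return self.decoder(z)
```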