Tanh Activation (limited negative, limited positive):
Sigmoid Activation (no negative, limited positive):
ReLU (no negative, positive to infinity):
Softplus, a.k.a. smooth ReLU (no negative, positive to infinity, smooth transition around zero):
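The four activations above can be plotted with a short sketch like the following (a minimal illustration assuming NumPy and Matplotlib are available; the input range and plot layout are arbitrary choices):

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5.0, 5.0, 500)
activations = {
    "Tanh": np.tanh(x),                   # limited negative, limited positive: (-1, 1)
    "Sigmoid": 1.0 / (1.0 + np.exp(-x)),  # no negative, limited positive: (0, 1)
    "ReLU": np.maximum(0.0, x),           # no negative, positive to infinity
    "Softplus": np.log1p(np.exp(x)),      # smooth ReLU: no negative, positive to infinity
}

fig, axes = plt.subplots(2, 2, figsize=(8, 6))
for ax, (name, y) in zip(axes.flat, activations.items()):
    ax.plot(x, y)
    ax.set_title(name)
    ax.grid(True)
plt.tight_layout()
plt.show()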
Basically, an activation function is selected depending on what the output values should be (see the short sketch after the list below):
- Identity activation: No negative limit, no positive limit.
- Tanh: Limited negative, limited positive.
- Sigmoid: No negative, limited positive.
- ReLU: No negative, no positive limit.
- Etc.
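To make the limits in the list concrete, here is a small sketch that evaluates each activation at large negative and positive inputs (the inputs +/-100 are arbitrary, purely illustrative values):

import math

def identity(x): return x
def sigmoid(x):  return 1.0 / (1.0 + math.exp(-x))
def relu(x):     return max(0.0, x)

for name, f in [("Identity", identity), ("Tanh", math.tanh), ("Sigmoid", sigmoid), ("ReLU", relu)]:
    print(name, f(-100.0), f(100.0))

# Identity: -100.0 and 100.0  -> no negative limit, no positive limit
# Tanh:     -1.0   and 1.0    -> limited negative, limited positive
# Sigmoid:  ~0.0   and ~1.0   -> no negative, limited positive
# ReLU:     0.0    and 100.0  -> no negative, positive to infinity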
Any custom function can be used as an activation function; for example, a function similar to Tanh:
f(x) = [ sign(x)*(abs(x) - ln(cosh(x))) ] / 0.7
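This function saturates near +/- ln(2)/0.7, roughly +/- 0.99, much like Tanh. A minimal sketch of it (the sample inputs and the comparison to tanh are illustrative only):

import math

def custom_tanh_like(x):
    # f(x) = sign(x) * (abs(x) - ln(cosh(x))) / 0.7
    sign = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    return sign * (abs(x) - math.log(math.cosh(x))) / 0.7

for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
    print(x, round(custom_tanh_like(x), 4), round(math.tanh(x), 4))
# e.g. at x = 5.0 the custom function gives about 0.99 while tanh gives about 1.0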
Softsign is another sigmoid-like activation function that also uses abs(x):
softsign(x) = x / (abs(x)+1)
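A minimal sketch of softsign (the sample inputs are arbitrary; they just show the smooth saturation toward -1 and +1):

def softsign(x):
    # softsign(x) = x / (abs(x) + 1)
    return x / (abs(x) + 1.0)

for x in (-100.0, -1.0, 0.0, 1.0, 100.0):
    print(x, round(softsign(x), 4))
# -100 -> -0.9901, -1 -> -0.5, 0 -> 0.0, 1 -> 0.5, 100 -> 0.9901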