Fix the formula of SELU/selu (#26675)

hong19860320 5 years ago committed by GitHub
parent c70bc3bba3
commit dbcef732d8

@@ -644,7 +644,11 @@ def selu(x,
.. math::
-        selu(x) = scale * (max(0,x) + min(0, alpha * (e^{x} - 1)))
+        selu(x) = scale *
+            \\begin{cases}
+            x, \\text{if } x > 0 \\\\
+            alpha * e^{x} - alpha, \\text{if } x <= 0
+            \\end{cases}
Parameters:
x (Tensor): The input Tensor with data type float32, float64.
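The removed one-line formula and the new piecewise formula are mathematically equivalent: for x > 0 the `min` term vanishes, and for x <= 0 the `max` term vanishes. A minimal pure-Python sketch (the `alpha` value is assumed here to be the canonical SELU default; the source only states `scale`) checks this:

```python
import math

SCALE = 1.0507009873554804934193349852946  # default scale from the docstring
ALPHA = 1.6732632423543772848170429916717  # assumed canonical SELU alpha

def selu_old(x):
    # Formulation removed by this commit
    return SCALE * (max(0.0, x) + min(0.0, ALPHA * (math.exp(x) - 1)))

def selu_new(x):
    # Piecewise formulation added by this commit
    return SCALE * x if x > 0 else SCALE * (ALPHA * math.exp(x) - ALPHA)

# Both agree across negative, zero, and positive inputs
for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    assert abs(selu_old(x) - selu_new(x)) < 1e-12
```

The commit therefore changes only the documentation's presentation, not the operator's behavior.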

@@ -552,7 +552,11 @@ class SELU(layers.Layer):
.. math::
-        SELU(x) = scale * (max(0,x) + min(0, alpha * (e^{x} - 1)))
+        SELU(x) = scale *
+            \\begin{cases}
+            x, \\text{if } x > 0 \\\\
+            alpha * e^{x} - alpha, \\text{if } x <= 0
+            \\end{cases}
Parameters:
scale (float, optional): The value of scale for SELU. Default is 1.0507009873554804934193349852946
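A minimal stand-in sketch of the layer interface this hunk documents, applying the piecewise formula elementwise to a plain list (the `alpha` default is an assumption; only `scale` appears in the source, and this class is illustrative, not Paddle's implementation):

```python
import math

class SELU:
    """Illustrative stand-in for a SELU layer; applies the activation
    elementwise to a list of floats."""

    def __init__(self,
                 scale=1.0507009873554804934193349852946,  # default from the docs
                 alpha=1.6732632423543772848170429916717):  # assumed SELU default
        self.scale = scale
        self.alpha = alpha

    def __call__(self, xs):
        # Piecewise form from the corrected docstring
        return [self.scale * x if x > 0
                else self.scale * (self.alpha * math.exp(x) - self.alpha)
                for x in xs]

act = SELU()
out = act([-1.0, 0.0, 1.0])  # negative input maps below zero, zero stays zero
```

In the real API the layer operates on Tensors, but the elementwise arithmetic is the same.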
