Class ELU
java.lang.Object
smile.deep.activation.ActivationFunction
smile.deep.activation.ELU
- All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer
Exponential Linear Unit (ELU) activation function.
ELU is defined as:
x                      if x > 0
alpha * (exp(x) - 1)   if x <= 0
Unlike ReLU, ELU can produce negative outputs, pushing the mean activation closer to zero, which can speed up training.
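A minimal scalar sketch of the definition above, for illustration only (the ELU class itself maps Tensor to Tensor):

    // Scalar version of the ELU formula; the actual class operates on Tensors.
    static double elu(double x, double alpha) {
        return x > 0 ? x : alpha * (Math.exp(x) - 1);
    }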
Constructor Details
ELU
public ELU()
Constructor with default alpha = 1.0.

ELU
public ELU(double alpha, boolean inplace)
Constructor.
Parameters:
alpha - the alpha value for the ELU formulation. Must be non-negative.
inplace - true if the operation executes in-place.
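A brief usage sketch, not part of the original page; the Tensor.rand factory call is an assumption about the smile.deep.tensor API and may need adapting:

    import smile.deep.activation.ELU;
    import smile.deep.tensor.Tensor;

    public class EluDemo {
        public static void main(String[] args) {
            ELU elu = new ELU(1.0, false);  // alpha = 1.0, not in-place
            Tensor x = Tensor.rand(2, 3);   // assumed factory method
            Tensor y = elu.apply(x);        // Function<Tensor,Tensor> view of forward
            System.out.println(y);
        }
    }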
Method Details
alpha
public double alpha()
Returns the alpha value.
Returns:
the alpha value.
forward
public Tensor forward(Tensor input)
Forward propagation.
Parameters:
input - the input tensor.
Returns:
the output tensor.