Class ELU

All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer

public class ELU extends ActivationFunction
Exponential Linear Unit (ELU) activation function.

ELU is defined as:

  • x if x > 0
  • alpha * (exp(x) - 1) if x <= 0

Unlike ReLU, ELU produces negative outputs for negative inputs (bounded below by -alpha), pushing the mean activation closer to zero, which can speed up training.
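
As a point of reference, here is a minimal standalone sketch of the element-wise formula in plain Java; it is independent of this class and of the Tensor API:

  // Scalar ELU, mirroring the definition above.
  static double elu(double x, double alpha) {
      return x > 0 ? x : alpha * (Math.exp(x) - 1);
  }

  // elu(2.0, 1.0)  ->  2.0
  // elu(-1.0, 1.0) ->  exp(-1) - 1 ≈ -0.632
  // As x -> -infinity, the output approaches -alpha.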

  • Constructor Details

    • ELU

      public ELU()
      Constructor with default alpha = 1.0.
    • ELU

      public ELU(double alpha, boolean inplace)
      Constructor.
      Parameters:
      alpha - the alpha value for the ELU formulation. Must be non-negative.
      inplace - true if the operation executes in-place.
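
      A brief usage sketch of the two constructors and the alpha() accessor; the import is an assumption, since the package is not stated on this page (adjust it to wherever ELU lives in your build):

        import smile.deep.activation.ELU; // assumed package, not shown above

        ELU elu1 = new ELU();             // default alpha = 1.0
        ELU elu2 = new ELU(0.5, false);   // custom alpha, not in-place
        System.out.println(elu1.alpha()); // prints 1.0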
  • Method Details

    • alpha

      public double alpha()
      Returns the alpha value.
      Returns:
      the alpha value.
    • forward

      public Tensor forward(Tensor input)
      Description copied from interface: Layer
      Forward propagation (or forward pass) through the layer.
      Parameters:
      input - the input tensor.
      Returns:
      the output tensor.
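
      A hedged sketch of a forward pass. Tensor construction is not covered on this page, so the helper below takes the input as a parameter; the packages in the imports are assumptions:

        import smile.deep.activation.ELU; // assumed package
        import smile.deep.tensor.Tensor;  // assumed package

        // Applies ELU element-wise using only the API documented above.
        static Tensor applyElu(Tensor input) {
            ELU elu = new ELU(1.0, false); // alpha = 1.0, out-of-place
            return elu.forward(input);     // positive entries pass through;
                                           // negative entries -> alpha * (exp(x) - 1)
        }

      Since the class implements Function<Tensor,Tensor> (see the interface list at the top), an instance can also be passed anywhere that Function type is expected.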