Class LeakyReLU

All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer

public class LeakyReLU extends ActivationFunction
Leaky Rectified Linear Unit activation function: f(x) = x for x >= 0, and f(x) = negativeSlope * x for x < 0.
  • Constructor Details

    • LeakyReLU

      public LeakyReLU()
      Constructor.
    • LeakyReLU

      public LeakyReLU(double negativeSlope, boolean inplace)
      Constructor.
      Parameters:
      negativeSlope - Controls the angle of the negative slope, which is used for negative input values.
      inplace - true if the operation executes in-place.
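The effect of negativeSlope can be sketched with a minimal standalone class (not the library's Tensor-based API; the class name and use of scalar doubles are illustrative assumptions): positive inputs pass through unchanged, while negative inputs are scaled by the slope.

```java
// Hypothetical sketch of the Leaky ReLU function on scalars, assuming only
// the definition f(x) = x for x >= 0, negativeSlope * x otherwise.
public class LeakyReLUSketch {
    private final double negativeSlope;

    public LeakyReLUSketch(double negativeSlope) {
        this.negativeSlope = negativeSlope;
    }

    public double apply(double x) {
        // Positive (and zero) inputs are the identity; negatives are scaled.
        return x >= 0 ? x : negativeSlope * x;
    }

    public static void main(String[] args) {
        LeakyReLUSketch f = new LeakyReLUSketch(0.01);
        System.out.println(f.apply(3.0));   // positive input passes through
        System.out.println(f.apply(-2.0));  // negative input is scaled by 0.01
    }
}
```

A small nonzero negativeSlope keeps a gradient flowing for negative inputs, which is the usual motivation for this variant over plain ReLU.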
  • Method Details

    • forward

      public Tensor forward(Tensor input)
      Description copied from interface: Layer
      Forward propagation (or forward pass) through the layer.
      Parameters:
      input - the input tensor.
      Returns:
      the output tensor.
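Since the Tensor type is library-specific, the forward pass can be sketched over a plain double[] (the class and method names here are illustrative assumptions, not the library's API): the activation is applied elementwise, and the inplace flag from the constructor decides whether the input buffer is overwritten or copied.

```java
import java.util.Arrays;

// Hypothetical sketch of the elementwise forward pass, using double[] in
// place of Tensor. When inplace is true the input array itself is modified.
public class LeakyForwardSketch {
    static double[] forward(double[] input, double negativeSlope, boolean inplace) {
        // Copy the input unless the operation is requested in-place.
        double[] out = inplace ? input : Arrays.copyOf(input, input.length);
        for (int i = 0; i < out.length; i++) {
            if (out[i] < 0) out[i] *= negativeSlope;  // scale only negative entries
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {-1.0, 0.0, 2.0};
        System.out.println(Arrays.toString(forward(x, 0.1, false)));
        System.out.println(Arrays.toString(x));  // original is untouched when inplace = false
    }
}
```

Executing in-place avoids allocating a second buffer, at the cost of destroying the layer's input, which matters if that tensor is needed again during backpropagation.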