Class ReLU

All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer

public class ReLU extends ActivationFunction
Rectified Linear Unit activation function, which applies the element-wise function ReLU(x) = max(0, x).
  • Constructor Details

    • ReLU

      public ReLU(boolean inplace)
      Constructor. A brief usage sketch appears at the end of this page.
      Parameters:
      inplace - true if the operation modifies the input tensor in place.
  • Method Details

    • forward

      public Tensor forward(Tensor input)
      Description copied from interface: Layer
      Forward propagation (or forward pass) through the layer.
      Parameters:
      input - the input tensor.
      Returns:
      the output tensor.
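  • Usage Example

      The sketch below is not part of the generated documentation. It assumes the
      ReLU, Tensor, and Layer types documented above are on the classpath and
      already imported, and that an input tensor has been created elsewhere
      (tensor construction is not covered on this page).

      // Apply ReLU without touching the input: inplace = false means
      // forward(...) writes the result to a new tensor and leaves x unchanged.
      public Tensor applyReLU(Tensor x) {
          ReLU relu = new ReLU(false);
          // Element-wise: negative entries become 0, non-negative entries pass through.
          return relu.forward(x);
      }

      // Passing inplace = true to the constructor instead would perform the
      // operation in place, overwriting the input tensor with the result.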