Class ActivationFunction

java.lang.Object
smile.deep.activation.ActivationFunction
All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer
Direct Known Subclasses:
GELU, GLU, HardShrink, LeakyReLU, LogSigmoid, LogSoftmax, ReLU, Sigmoid, SiLU, Softmax, SoftShrink, Tanh, TanhShrink

public abstract class ActivationFunction extends Object implements Layer, Serializable
The activation function. It also implements the Layer interface so that it can be added into a network as a layer.
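For example, because ActivationFunction implements both Function<Tensor,Tensor> and Layer, an instance can be applied directly to a tensor or stacked into a network. A minimal sketch; the no-arg ReLU constructor is an assumption based on the subclass list above:

    import java.util.function.Function;
    import smile.deep.activation.ReLU;
    import smile.deep.tensor.Tensor;

    public class ActivationDemo {
        // Any ActivationFunction can be passed where a plain
        // tensor-to-tensor function is expected.
        static Tensor squash(Function<Tensor, Tensor> activation, Tensor x) {
            return activation.apply(x);
        }

        static Tensor demo(Tensor x) {
            return squash(new ReLU(), x); // assumed no-arg constructor
        }
    }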
  • Constructor Details

    • ActivationFunction

      public ActivationFunction(String name, boolean inplace)
      Constructor.
      Parameters:
      name - the function name.
      inplace - true if the operation executes in-place.
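      Since ActivationFunction is abstract, this constructor is invoked
      from a subclass. A minimal sketch of a hypothetical Identity
      activation, assuming Layer declares a forward(Tensor) method that
      concrete activations implement (not shown on this page):

          import smile.deep.activation.ActivationFunction;
          import smile.deep.tensor.Tensor;

          public class Identity extends ActivationFunction {
              public Identity() {
                  super("Identity", false); // named "Identity", not an in-place op
              }

              @Override
              public Tensor forward(Tensor input) {
                  // Assumed abstract hook from Layer: the identity
                  // activation returns its input unchanged.
                  return input;
              }
          }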
  • Method Details

    • name

      public String name()
      Returns the name of the activation function.
      Returns:
      the name of the activation function.
    • isInplace

      public boolean isInplace()
      Returns true if the operation executes in-place.
      Returns:
      true if the operation executes in-place.
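      For example, name() and isInplace() simply report what was passed
      to the constructor. A short sketch; the ReLU(boolean) constructor
      is an assumption:

          ActivationFunction relu = new ReLU(true); // assumed ReLU(boolean inplace)
          System.out.println(relu.name());      // the function name, e.g. "ReLU"
          System.out.println(relu.isInplace()); // true: overwrites its input tensor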
    • asTorch

      public org.bytedeco.pytorch.Module asTorch()
      Description copied from interface: Layer
      Returns the PyTorch Module object.
      Specified by:
      asTorch in interface Layer
      Returns:
      the PyTorch Module object.
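      A short sketch of retrieving the native module, e.g. to register it
      in a hand-built Bytedeco PyTorch module tree; the no-arg ReLU
      constructor is an assumption:

          import org.bytedeco.pytorch.Module;
          import smile.deep.activation.ReLU;

          public class AsTorchDemo {
              public static void main(String[] args) {
                  // The native PyTorch module backing the layer.
                  Module module = new ReLU().asTorch();
                  System.out.println(module);
              }
          }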