Class Hardswish

All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer

public class Hardswish extends ActivationFunction
Hard Swish activation function.

Hard Swish is a computationally efficient approximation of Swish/SiLU:

  hardswish(x) = x * hardsigmoid(x)
               = x * ReLU6(x + 3) / 6

It is used in MobileNetV3 and EfficientNetV2 to reduce the computational cost of the sigmoid-based Swish activation.
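
A minimal scalar sketch of the formula above, for illustration only; the library operates on Tensor objects, and the class name HardswishRef is hypothetical:

  // Scalar reference of hardswish(x) = x * ReLU6(x + 3) / 6.
  // Illustrative sketch only; not the library's Tensor-based implementation.
  final class HardswishRef {
      static double relu6(double x) {
          return Math.max(0.0, Math.min(6.0, x));
      }
      static double hardswish(double x) {
          return x * relu6(x + 3.0) / 6.0;   // equals x * hardsigmoid(x)
      }
  }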

  • Constructor Details

    • Hardswish

      public Hardswish()
      Constructor.
    • Hardswish

      public Hardswish(boolean inplace)
      Constructor.
      Parameters:
      inplace - true if the operation executes in-place.
  • Method Details

    • forward

      public Tensor forward(Tensor input)
      Description copied from interface: Layer
      Forward propagation (or forward pass) through the layer.
      Parameters:
      input - the input tensor.
      Returns:
      the output tensor.
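
A hedged usage sketch: Hardswish and forward are as documented above, while Tensor.of is a hypothetical factory method standing in for whatever constructor or factory the Tensor class actually provides:

  // Apply Hard Swish to a tensor; with inplace = true the input
  // buffer may be overwritten by the result.
  Hardswish act = new Hardswish(true);
  Tensor input = Tensor.of(new float[] {-4f, -1f, 0f, 1f, 4f});  // hypothetical factory
  Tensor output = act.forward(input);  // elementwise x * ReLU6(x + 3) / 6

For x <= -3 the output is 0, for x >= 3 it equals x, and in between it follows the quadratic x * (x + 3) / 6.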