Class Hardswish
java.lang.Object
smile.deep.activation.ActivationFunction
smile.deep.activation.Hardswish
- All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer
Hard Swish activation function.
Hard Swish is a computationally efficient, piecewise-linear approximation of Swish/SiLU:
hardswish(x) = x * hardsigmoid(x)
             = x * ReLU6(x + 3) / 6
It is used in MobileNetV3 and EfficientNetV2 to reduce computational cost relative to the sigmoid-based Swish.
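The formula above can be sketched in plain Java to show the piecewise behavior. This is a minimal illustration of the math only, not the Smile implementation; the class and method names here are hypothetical.

```java
// HardswishDemo: a minimal sketch of the Hard Swish formula.
// Not the Smile implementation; illustrates hardswish(x) = x * ReLU6(x + 3) / 6.
public class HardswishDemo {
    // ReLU6 clamps its argument to the range [0, 6].
    static double relu6(double x) {
        return Math.min(Math.max(x, 0.0), 6.0);
    }

    // hardswish(x) = x * ReLU6(x + 3) / 6
    static double hardswish(double x) {
        return x * relu6(x + 3.0) / 6.0;
    }

    public static void main(String[] args) {
        // For x <= -3 the output is 0; for x >= 3 it equals x;
        // in between it interpolates smoothly, like Swish.
        System.out.println(hardswish(-4.0)); // 0.0
        System.out.println(hardswish(0.0));  // 0.0
        System.out.println(hardswish(4.0));  // 4.0
    }
}
```

Because ReLU6 needs only a clamp, the whole activation costs a multiply, an add, a clamp, and a divide, avoiding the exponential in sigmoid-based Swish.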
Constructor Details
- Hardswish
  public Hardswish()
  Constructor.
- Hardswish
  public Hardswish(boolean inplace)
  Constructor.
  Parameters:
  inplace - true if the operation executes in-place.
Method Details
- forward