Package smile.deep.activation
Class SiLU
java.lang.Object
smile.deep.activation.ActivationFunction
smile.deep.activation.SiLU
All Implemented Interfaces:
Serializable, Function&lt;Tensor,Tensor&gt;, Layer
Sigmoid Linear Unit activation function, defined as silu(x) = x * sigmoid(x). It is also known as the swish function.
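The computation itself is simple to state; the following self-contained sketch (plain Java, deliberately independent of the smile API) illustrates the element-wise function this layer applies:

```java
// Illustrative sketch of the SiLU computation, silu(x) = x * sigmoid(x).
// This does not use smile classes; it only demonstrates the math.
public class SiLUExample {

    // Standard logistic sigmoid: 1 / (1 + e^-x).
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // SiLU scales the input by its own sigmoid.
    static double silu(double x) {
        return x * sigmoid(x);
    }

    public static void main(String[] args) {
        for (double x : new double[] {-2.0, 0.0, 2.0}) {
            // silu(-2.0) ~ -0.2384, silu(0.0) = 0, silu(2.0) ~ 1.7616
            System.out.printf("silu(%.1f) = %.4f%n", x, silu(x));
        }
    }
}
```

Note that for large positive x, silu(x) approaches x (since sigmoid(x) approaches 1), while for large negative x it approaches 0, giving a smooth, non-monotonic alternative to ReLU.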
Constructor Summary

SiLU(boolean inplace)
Constructor.
Method Summary
Methods inherited from class smile.deep.activation.ActivationFunction
asTorch, isInplace, name
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface smile.deep.layer.Layer
apply, isTraining
Constructor Details
SiLU

public SiLU(boolean inplace)

Constructor.

Parameters:
inplace - true if the operation executes in-place.
Method Details
forward

public Tensor forward(Tensor input)

Description copied from interface: Layer
Forward propagation (or forward pass) through the layer.

Parameters:
input - the input tensor.
Returns:
the output tensor.