Package smile.deep.activation
Class ActivationFunction
java.lang.Object
smile.deep.activation.ActivationFunction
- All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer
- Direct Known Subclasses:
GELU, GLU, HardShrink, LeakyReLU, LogSigmoid, LogSoftmax, ReLU, Sigmoid, SiLU, Softmax, SoftShrink, Tanh, TanhShrink
The activation function. It also implements the layer interface so that it can be added into a network as a layer.
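Because an activation implements both Function<Tensor,Tensor> and Layer, it can be applied directly to a tensor or stacked into a network like any other layer. A minimal sketch; the ReLU(boolean) constructor argument and the Tensor.randn factory are assumptions, not verified against this exact release:

    import smile.deep.activation.ReLU;
    import smile.deep.tensor.Tensor;

    // Assumed factory and constructor; see the signatures documented on this page.
    Tensor x = Tensor.randn(2, 3);   // random 2x3 input
    ReLU relu = new ReLU(false);     // inplace = false
    Tensor y = relu.apply(x);        // used as a Function<Tensor,Tensor>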
Constructor Details
-
ActivationFunction
public ActivationFunction(String name, boolean inplace)
Constructor.
- Parameters:
name - the function name.
inplace - true if the operation executes in-place.
-
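Subclasses call this constructor with their name and in-place flag and supply the forward computation. A hypothetical custom activation, assuming apply(Tensor) is the abstract forward method and that Tensor exposes mul() and sigmoid() mirroring the underlying PyTorch operations:

    import smile.deep.activation.ActivationFunction;
    import smile.deep.tensor.Tensor;

    // Hypothetical subclass; the method names on Tensor are assumptions.
    public class Swish extends ActivationFunction {
        public Swish() {
            super("Swish", false);     // name, inplace
        }

        @Override
        public Tensor apply(Tensor x) {
            return x.mul(x.sigmoid()); // swish(x) = x * sigmoid(x)
        }
    }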
Method Details
-
name
public String name()
Returns the name of the activation function.
- Returns:
- the name of the activation function.
-
isInplace
public boolean isInplace()
Returns true if the operation executes in-place.
- Returns:
- true if the operation executes in-place.
-
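An in-place activation overwrites its input tensor instead of allocating a new output, which saves memory but destroys the original values. A short sketch of checking the flag (the ReLU(boolean) constructor is assumed as above):

    import smile.deep.activation.ActivationFunction;
    import smile.deep.activation.ReLU;

    ActivationFunction f = new ReLU(true); // assumed in-place variant
    if (f.isInplace()) {
        // apply(x) mutates x; copy the input first if it is needed later
    }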
asTorch
public org.bytedeco.pytorch.Module asTorch()
Description copied from interface: Layer
Returns the PyTorch Module object.
-
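asTorch() exposes the underlying Bytedeco libtorch module, so a Smile activation can be handed to code that works with raw PyTorch objects. A minimal sketch (constructor argument assumed as above):

    import org.bytedeco.pytorch.Module;
    import smile.deep.activation.ReLU;

    ReLU relu = new ReLU(false);
    Module module = relu.asTorch(); // the underlying libtorch module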