Package smile.deep.activation
Class GELU
java.lang.Object
smile.deep.activation.ActivationFunction
smile.deep.activation.GELU
- All Implemented Interfaces:
  Serializable, Function&lt;Tensor,Tensor&gt;, Layer
Gaussian Error Linear Unit activation function, GELU(x) = x · Φ(x), where Φ is the standard normal cumulative distribution function.
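For intuition, a minimal stand-alone sketch of the GELU function using the common tanh approximation. This is an illustration only, not Smile's implementation (which operates in-place or out-of-place on Tensor objects); the class name GeluSketch is hypothetical.

```java
// Illustrative sketch of GELU via the tanh approximation:
// GELU(x) = x * Phi(x) ~ 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
// Not Smile's Tensor-based implementation.
public class GeluSketch {
    static double gelu(double x) {
        double c = Math.sqrt(2.0 / Math.PI);
        return 0.5 * x * (1.0 + Math.tanh(c * (x + 0.044715 * x * x * x)));
    }

    public static void main(String[] args) {
        // GELU is smooth and non-monotone near zero, unlike ReLU.
        for (double x : new double[]{-2.0, -1.0, 0.0, 1.0, 2.0}) {
            System.out.printf("GELU(%.1f) = %.6f%n", x, gelu(x));
        }
    }
}
```

Note that, unlike ReLU, GELU weights negative inputs by their probability under a standard normal rather than zeroing them outright, e.g. GELU(-1) ≈ -0.159.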
Constructor Details
-
GELU
public GELU(boolean inplace)
Constructor.
- Parameters:
  inplace - true if the operation executes in-place.
Method Details
-
forward
Description copied from interface: Layer
Forward propagation (or forward pass) through the layer.
- Parameters:
  input - the input tensor.
- Returns:
  the output tensor.