Package smile.deep.activation
Class LeakyReLU
java.lang.Object
  smile.deep.activation.ActivationFunction
    smile.deep.activation.LeakyReLU
- All Implemented Interfaces:
  Serializable, Function<Tensor,Tensor>, Layer
Leaky Rectified Linear Unit activation function. Unlike ReLU, which outputs zero for all negative inputs, LeakyReLU allows a small, non-zero output for negative inputs, controlled by the negative slope.
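For reference, the element-wise rule this layer applies is sketched below in plain Java. The scalar helper is illustrative only; the actual layer applies the same rule to every element of the input tensor.

  // Illustrative scalar form of the LeakyReLU rule; the layer applies
  // this element-wise across the whole tensor.
  static double leakyRelu(double x, double negativeSlope) {
      return x >= 0.0 ? x : negativeSlope * x;
  }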
Constructor Details

- LeakyReLU
  public LeakyReLU()
  Constructor.

- LeakyReLU
  public LeakyReLU(double negativeSlope, boolean inplace)
  Constructor.
  Parameters:
    negativeSlope - Controls the angle of the negative slope, which is used for negative input values.
    inplace - true if the operation executes in-place.
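A minimal construction sketch, assuming the smile.deep.activation package shown on this page. The negative-slope value 0.01 is a common choice for illustration, not a documented default of this class:

  import smile.deep.activation.LeakyReLU;

  // Out-of-place LeakyReLU with a negative slope of 0.01.
  LeakyReLU relu = new LeakyReLU(0.01, false);

  // In-place variant: the input tensor is modified directly,
  // trading the original values for lower memory use.
  LeakyReLU reluInplace = new LeakyReLU(0.01, true);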
Method Details

- forward
  public Tensor forward(Tensor input)
  Description copied from interface: Layer
  Forward propagation (or forward pass) through the layer.
  Parameters:
    input - the input tensor.
  Returns:
    the output tensor.
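A hedged sketch of a forward pass. How the input tensor is obtained is an assumption here, since Tensor's factory API is not documented on this page:

  // Assume `input` is a Tensor obtained elsewhere from the smile.deep API.
  LeakyReLU relu = new LeakyReLU(0.1, false);
  Tensor output = relu.forward(input);  // negatives scaled by 0.1, positives unchanged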