Package smile.deep.activation
Class LeakyReLU
java.lang.Object
    smile.deep.activation.ActivationFunction
        smile.deep.activation.LeakyReLU
All Implemented Interfaces:
Serializable, Function&lt;Tensor,Tensor&gt;, Layer
Leaky Rectified Linear Unit activation function. It allows a small, non-zero gradient when the input is negative.
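As a standard definition (not stated on this page): Leaky ReLU computes f(x) = x for x >= 0 and f(x) = negativeSlope * x for x < 0, where negativeSlope is typically a small constant such as 0.01.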
Constructor Summary

LeakyReLU()
Constructor.
LeakyReLU(double negativeSlope, boolean inplace)
Constructor.
Method Summary
Methods inherited from class smile.deep.activation.ActivationFunction
asTorch, isInplace, name
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface smile.deep.layer.Layer
apply, isTraining
Constructor Details

LeakyReLU
public LeakyReLU()
Constructor.

LeakyReLU
public LeakyReLU(double negativeSlope, boolean inplace)
Constructor.
Parameters:
negativeSlope - Controls the angle of the negative slope, which is used for negative input values.
inplace - true if the operation executes in-place.
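A minimal usage sketch of the two-argument constructor. Only the constructor and the forward method documented on this page are used; the import path for Tensor (smile.deep.tensor) is an assumption:

    import smile.deep.activation.LeakyReLU;
    import smile.deep.tensor.Tensor;

    public class LeakyReLUExample {
        // Applies Leaky ReLU with a small negative slope to an input tensor.
        static Tensor activate(Tensor input) {
            // negativeSlope = 0.01 keeps a small gradient for negative inputs;
            // inplace = false leaves the input tensor unmodified.
            LeakyReLU activation = new LeakyReLU(0.01, false);
            return activation.forward(input);
        }
    }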
Method Details

forward
public Tensor forward(Tensor input)
Description copied from interface: Layer
Forward propagation (or forward pass) through the layer.
Parameters:
input - the input tensor.
Returns:
the output tensor.
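For reference, the element-wise computation that forward performs can be sketched on a plain double array. This is an illustration of the semantics only, not the actual implementation; the layer itself operates on Tensor objects (the inherited asTorch method suggests delegation to a native PyTorch backend):

    // Element-wise Leaky ReLU semantics on a plain array (illustrative sketch).
    static double[] leakyRelu(double[] x, double negativeSlope) {
        double[] y = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            // Pass non-negative values through; scale negative values by the slope.
            y[i] = x[i] >= 0 ? x[i] : negativeSlope * x[i];
        }
        return y;
    }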