Class LeakyReLU
java.lang.Object
smile.deep.activation.ActivationFunction
smile.deep.activation.LeakyReLU
- All Implemented Interfaces:
Serializable, Function<Tensor,Tensor>, Layer
Leaky rectified linear unit (Leaky ReLU) activation function, which allows a small, non-zero gradient for negative input values.
Constructor Details

LeakyReLU
public LeakyReLU()
Constructor.

LeakyReLU
public LeakyReLU(double negativeSlope, boolean inplace)
Constructor.
Parameters:
negativeSlope - Controls the angle of the negative slope, which is used for negative input values.
inplace - true if the operation executes in-place.
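A minimal construction sketch based only on the two constructors listed above. The class name of the example program is illustrative, and the defaults chosen by the no-argument constructor are not documented on this page, so the comments are hedged accordingly.

import smile.deep.activation.LeakyReLU;

public class LeakyReLUConstructors {
    public static void main(String[] args) {
        // No-argument constructor: uses the library's default negative slope
        // (not documented on this page) and does not run in-place.
        LeakyReLU leakyDefault = new LeakyReLU();

        // negativeSlope controls the angle applied to negative inputs;
        // passing inplace = true would let the operation overwrite its input.
        LeakyReLU leaky = new LeakyReLU(0.01, false);

        System.out.println(leakyDefault);
        System.out.println(leaky);
    }
}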
Method Details

forward
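The forward entry above is truncated on this page. Below is a hedged sketch of how the method is typically invoked, assuming forward takes and returns a Tensor (as implied by the Layer and Function<Tensor,Tensor> interfaces the class implements); Tensor.rand is used only as an assumed, illustrative factory and should be checked against the Tensor Javadoc.

import java.util.function.Function;

import smile.deep.activation.LeakyReLU;
import smile.deep.tensor.Tensor;

public class LeakyReLUForwardSketch {
    // Because LeakyReLU implements Function<Tensor, Tensor>, an instance can be
    // handed to any code that expects a generic tensor transform.
    static Tensor transform(Tensor x, Function<Tensor, Tensor> f) {
        return f.apply(x);
    }

    public static void main(String[] args) {
        LeakyReLU leaky = new LeakyReLU(0.2, false);

        // Assumption: Tensor.rand creates a randomly initialized tensor of the
        // given shape; substitute whatever factory your Smile version provides.
        Tensor x = Tensor.rand(2, 3);

        Tensor viaForward = leaky.forward(x);    // Layer-style call (assumed signature)
        Tensor viaApply = transform(x, leaky);   // Function-style call via apply

        System.out.println(viaForward);
        System.out.println(viaApply);
    }
}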