Package smile.deep.layer
Class BatchNorm1dLayer
java.lang.Object
smile.deep.layer.BatchNorm1dLayer
A batch normalization layer that re-centers and normalizes the output
of one layer before feeding it to another. Centering and scaling the
intermediate tensors have a number of beneficial effects, such as allowing
higher learning rates without exploding/vanishing gradients.
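For orientation only (this is not part of the original Javadoc), the per-channel transform conventionally computed by 1-d batch normalization is

    y = (x - E[x]) / sqrt(Var[x] + eps) * gamma + beta

where E[x] and Var[x] are the batch mean and variance, eps matches the constructor argument of the same name, and gamma and beta are the learnable parameters present when affine is true.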
-
Constructor Summary
Constructor                                                                    Description
BatchNorm1dLayer(int channels)                                                 Constructor.
BatchNorm1dLayer(int channels, double eps, double momentum, boolean affine)    Constructor.
Method Summary
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface smile.deep.layer.Layer
apply, isTraining
-
Constructor Details
-
BatchNorm1dLayer
public BatchNorm1dLayer(int channels)
Constructor.
Parameters:
channels - the number of input channels.
-
BatchNorm1dLayer
public BatchNorm1dLayer(int channels, double eps, double momentum, boolean affine)
Constructor.
Parameters:
channels - the number of input channels.
eps - a value added to the denominator for numerical stability.
momentum - the value used for the running_mean and running_var computation. Can be set to 0.0 for a cumulative moving average (i.e. simple average).
affine - when set to true, this layer has learnable affine parameters.
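A minimal usage sketch, not taken from the Javadoc: it assumes the layer consumes and produces smile.deep.tensor.Tensor through the inherited apply method, and Tensor.randn is used only as a placeholder for however the input batch is actually created; check the Tensor API for the exact factory methods.

import smile.deep.layer.BatchNorm1dLayer;
import smile.deep.tensor.Tensor;

public class BatchNormExample {
    public static void main(String[] args) {
        // Only the channel count is required; eps, momentum, and affine take defaults.
        BatchNorm1dLayer bn = new BatchNorm1dLayer(64);

        // All hyperparameters spelled out: eps for numerical stability, momentum for
        // the running_mean/running_var update, affine to enable learnable scale/shift.
        BatchNorm1dLayer bnCustom = new BatchNorm1dLayer(64, 1e-5, 0.1, true);

        // Tensor.randn is assumed here only as a stand-in for producing an input
        // batch of 32 samples with 64 channels; substitute the actual tensor factory.
        Tensor x = Tensor.randn(32, 64);

        // apply is inherited from the Layer interface and runs the normalization.
        Tensor y = bn.apply(x);
        System.out.println(y);
    }
}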
-
Method Details