Class BatchNorm2dLayer

java.lang.Object
smile.deep.layer.BatchNorm2dLayer
All Implemented Interfaces:
Function<Tensor,Tensor>, Layer

public class BatchNorm2dLayer extends Object implements Layer
A batch normalization layer that re-centers and re-scales the output of one layer before feeding it to another. Normalizing the intermediate tensors in this way has a number of beneficial effects, such as allowing higher learning rates without exploding or vanishing gradients.
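As a point of reference (a summary of the standard batch normalization definition, not text from this class's documentation), the transform is applied per channel, with the mean and variance computed over the batch and spatial dimensions of the (N,C,H,W) input:

      y = (x - E[x]) / sqrt(Var[x] + eps) * gamma + beta

Here eps is the stabilizing constant accepted by the constructor, and gamma and beta are the learnable affine parameters that exist when affine is set to true.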
  • Constructor Details

    • BatchNorm2dLayer

      public BatchNorm2dLayer(int channels)
      Constructor.
      Parameters:
      channels - the number of input channels in (N,C,H,W).
    • BatchNorm2dLayer

      public BatchNorm2dLayer(int channels, double eps, double momentum, boolean affine)
      Constructor.
      Parameters:
      channels - the number of input channels in (N,C,H,W).
      eps - a value added to the denominator for numerical stability.
      momentum - the value used for the running_mean and running_var computation. Can be set to 0.0 for cumulative moving average (i.e. simple average).
      affine - when set to true, this layer has learnable affine parameters.
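A minimal construction sketch using only the constructors documented above; the channel count of 64 and the eps/momentum values are illustrative choices, not defaults taken from this class:

      import smile.deep.layer.BatchNorm2dLayer;

      // One-argument form: normalize a 64-channel feature map with the layer's defaults.
      BatchNorm2dLayer bn = new BatchNorm2dLayer(64);

      // Full form: explicit eps, momentum, and learnable affine parameters.
      BatchNorm2dLayer bnCustom = new BatchNorm2dLayer(64, 1e-5, 0.1, true);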
  • Method Details

    • asTorch

      public org.bytedeco.pytorch.Module asTorch()
      Description copied from interface: Layer
      Returns the PyTorch Module object.
      Specified by:
      asTorch in interface Layer
      Returns:
      the PyTorch Module object.
    • forward

      public Tensor forward(Tensor input)
      Description copied from interface: Layer
      Forward propagation (or forward pass) through the layer.
      Specified by:
      forward in interface Layer
      Parameters:
      input - the input tensor.
      Returns:
      the output tensor.
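A hedged end-to-end sketch of a forward pass. It assumes the Tensor class is smile.deep.tensor.Tensor and that it offers a randn(long...) factory for creating an (N,C,H,W) tensor; if your Smile version differs, substitute any tensor with the matching channel count:

      import smile.deep.layer.BatchNorm2dLayer;
      import smile.deep.tensor.Tensor;

      BatchNorm2dLayer bn = new BatchNorm2dLayer(64);

      // A batch of 8 feature maps: 64 channels, 32x32 spatial size.
      // Tensor.randn(...) is an assumed factory; adjust to your Smile version.
      Tensor input = Tensor.randn(8, 64, 32, 32);

      // Normalizes each channel of the input feature map.
      Tensor output = bn.forward(input);

      // The underlying PyTorch module is also available if needed.
      org.bytedeco.pytorch.Module module = bn.asTorch();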