Class InputLayer

java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.InputLayer
All Implemented Interfaces:
Serializable

public class InputLayer extends Layer
An input layer in the neural network.
  • Constructor Details

    • InputLayer

      public InputLayer(int p)
      Constructor.
      Parameters:
      p - the number of input variables (not including bias value).
    • InputLayer

      public InputLayer(int p, double dropout)
      Constructor.
      Parameters:
      p - the number of input variables (not including bias value).
      dropout - the dropout rate.
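Since the dropout rate here applies to the raw input variables, a rate of 0.1 drops each input with probability 0.1. As a self-contained illustration of the idea (plain Java; a conceptual sketch, not Smile's actual implementation), inverted dropout on an input vector can look like this:

```java
import java.util.Random;

public class DropoutDemo {
    // Hypothetical sketch of inverted dropout: each unit is zeroed with
    // probability `rate`; surviving units are scaled by 1/(1 - rate) so the
    // expected magnitude of the layer output is unchanged.
    static double[] dropout(double[] x, double rate, Random rng) {
        double scale = 1.0 / (1.0 - rate);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = rng.nextDouble() < rate ? 0.0 : x[i] * scale;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        double[] y = dropout(x, 0.5, new Random(42));
        // Each output is either 0 or the input doubled (1/(1 - 0.5) = 2).
        for (double v : y) System.out.print(v + " ");
        System.out.println();
    }
}
```

Dropout is a training-time regularizer; at inference time it is disabled and inputs pass through unscaled, which is why the surviving units are rescaled during training.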
  • Method Details

    • toString

      public String toString()
      Overrides:
      toString in class Object
    • propagate

      public void propagate(double[] x)
      Description copied from class: Layer
      Propagates the signals from a lower layer to this layer.
      Overrides:
      propagate in class Layer
      Parameters:
      x - the lower layer signals.
    • backpropagate

      public void backpropagate(double[] lowerLayerGradient)
      Description copied from class: Layer
      Propagates the errors back to a lower layer.
      Specified by:
      backpropagate in class Layer
      Parameters:
      lowerLayerGradient - the gradient vector of the lower layer.
    • transform

      public void transform(double[] x)
      Description copied from class: Layer
      The activation or output function.
      Specified by:
      transform in class Layer
      Parameters:
      x - the input and output values.
    • computeGradient

      public void computeGradient(double[] x)
      Description copied from class: Layer
      Computes the parameter gradient for a sample of the (mini-)batch.
      Overrides:
      computeGradient in class Layer
      Parameters:
      x - the input vector.
    • computeGradientUpdate

      public void computeGradientUpdate(double[] x, double learningRate, double momentum, double decay)
      Description copied from class: Layer
      Computes the parameter gradient and updates the weights.
      Overrides:
      computeGradientUpdate in class Layer
      Parameters:
      x - the input vector.
      learningRate - the learning rate.
      momentum - the momentum factor.
      decay - weight decay factor.
    • update

      public void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)
      Description copied from class: Layer
      Adjusts the network weights via the back-propagation algorithm.
      Overrides:
      update in class Layer
      Parameters:
      m - the size of mini-batch.
      learningRate - the learning rate.
      momentum - the momentum factor.
      decay - weight decay factor.
      rho - the RMSProp discounting factor for the history/incoming gradient.
      epsilon - a small constant for numerical stability.
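Since an input layer holds no trainable weights, its override of update is presumably trivial. For the hidden and output layers that share this signature, a single SGD step with momentum and weight decay can be sketched as follows (a plain-Java illustration under assumed semantics, not Smile's implementation; rho and epsilon would only come into play in the RMSProp variant):

```java
public class SgdUpdateDemo {
    // Hypothetical sketch of one optimizer step: the accumulated gradient is
    // averaged over the mini-batch of size m, folded into a momentum-smoothed
    // velocity, and applied together with L2-style weight decay.
    static void update(double[] w, double[] gradSum, double[] velocity,
                       int m, double learningRate, double momentum, double decay) {
        for (int i = 0; i < w.length; i++) {
            double g = gradSum[i] / m;                          // average gradient over the batch
            velocity[i] = momentum * velocity[i] - learningRate * g;
            w[i] += velocity[i] - learningRate * decay * w[i];  // decay shrinks the weight toward 0
            gradSum[i] = 0.0;                                   // reset the accumulator for the next batch
        }
    }

    public static void main(String[] args) {
        double[] w = {1.0};
        double[] gradSum = {2.0};
        double[] velocity = {0.0};
        update(w, gradSum, velocity, 2, 0.1, 0.9, 0.0);
        System.out.println(w[0]);  // averaged gradient 1.0, so w moves by -0.1
    }
}
```

With momentum = 0, this reduces to plain mini-batch SGD; a nonzero momentum reuses a fraction of the previous step, which smooths the trajectory across batches.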