Package smile.base.mlp
Class InputLayer
java.lang.Object
  smile.base.mlp.Layer
    smile.base.mlp.InputLayer
- All Implemented Interfaces:
Serializable
An input layer in the neural network.
Field Summary
Fields inherited from class smile.base.mlp.Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
Constructor                          Description
InputLayer(int p)                    Constructor.
InputLayer(int p, double dropout)    Constructor.
Method Summary
Modifier and TypeMethodDescriptionvoid
backpropagate
(double[] lowerLayerGradient) Propagates the errors back to a lower layer.void
computeGradient
(double[] x) Computes the parameter gradient for a sample of (mini-)batch.void
computeGradientUpdate
(double[] x, double learningRate, double momentum, double decay) Computes the parameter gradient and update the weights.void
propagate
(double[] x) Propagates the signals from a lower layer to this layer.toString()
void
transform
(double[] x) The activation or output function.void
update
(int m, double learningRate, double momentum, double decay, double rho, double epsilon) Adjust network weights by back-propagation algorithm.
Constructor Details

InputLayer
public InputLayer(int p)
Constructor.
- Parameters:
  p - the number of input variables (not including bias value).

InputLayer
public InputLayer(int p, double dropout)
Constructor.
- Parameters:
  p - the number of input variables (not including bias value).
  dropout - the dropout rate.
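A minimal sketch of constructing the layer with each documented constructor; the input dimension and dropout rate below are arbitrary illustrative values.

    import smile.base.mlp.InputLayer;

    public class InputLayerExample {
        public static void main(String[] args) {
            // Input layer for 4 input variables (the bias is not counted in p).
            InputLayer input = new InputLayer(4);

            // Same dimensionality, but with a 20% dropout rate on the inputs.
            InputLayer inputWithDropout = new InputLayer(4, 0.2);

            System.out.println(input);
            System.out.println(inputWithDropout);
        }
    }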
Method Details

toString
public String toString()
propagate
public void propagate(double[] x)
Description copied from class: Layer
Propagates the signals from a lower layer to this layer.
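A small sketch of the forward pass for a single sample; the feature vector length is assumed to match the p given to the constructor.

    import smile.base.mlp.InputLayer;

    public class PropagateExample {
        public static void main(String[] args) {
            InputLayer input = new InputLayer(3);

            // One sample with 3 features. For the input layer, the "lower layer"
            // signals are simply the raw feature vector.
            double[] x = {0.5, -1.2, 3.0};
            input.propagate(x);
        }
    }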
backpropagate
public void backpropagate(double[] lowerLayerGradient)
Description copied from class: Layer
Propagates the errors back to a lower layer.
- Specified by:
  backpropagate in class Layer
- Parameters:
  lowerLayerGradient - the gradient vector of the lower layer.
transform
public void transform(double[] x)
Description copied from class: Layer
The activation or output function.
computeGradient
public void computeGradient(double[] x)
Description copied from class: Layer
Computes the parameter gradient for a sample of a (mini-)batch.
- Overrides:
  computeGradient in class Layer
- Parameters:
  x - the input vector.
computeGradientUpdate
public void computeGradientUpdate(double[] x, double learningRate, double momentum, double decay)
Description copied from class: Layer
Computes the parameter gradient and updates the weights.
- Overrides:
  computeGradientUpdate in class Layer
- Parameters:
  x - the input vector.
  learningRate - the learning rate.
  momentum - the momentum factor.
  decay - the weight decay factor.
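A sketch of a per-sample (online) training step using this method; the hyperparameter values are arbitrary placeholders, and in a complete network the error would first be backpropagated from the layers above (omitted here).

    import smile.base.mlp.InputLayer;

    public class OnlineStepExample {
        public static void main(String[] args) {
            InputLayer input = new InputLayer(3);
            double[] x = {0.5, -1.2, 3.0};

            // Forward pass for this sample.
            input.propagate(x);

            // ... backpropagation from the upper layers would happen here ...

            // Per-sample gradient computation and weight update with placeholder
            // learning rate, momentum, and weight decay values.
            input.computeGradientUpdate(x, 0.01, 0.9, 1e-4);
        }
    }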
update
public void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)
Description copied from class: Layer
Adjusts the network weights by the back-propagation algorithm.
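For mini-batch training, a sketch under the assumption that m is the mini-batch size and that rho and epsilon are RMSProp-style smoothing parameters (neither is described above, so treat those names as assumptions); all hyperparameter values are placeholders.

    import smile.base.mlp.InputLayer;

    public class MiniBatchExample {
        public static void main(String[] args) {
            InputLayer input = new InputLayer(3);

            double[][] batch = {
                {0.5, -1.2, 3.0},
                {1.0,  0.3, -0.7}
            };

            // Accumulate per-sample gradients over the mini-batch. In a complete
            // network, backpropagation from the upper layers would run between
            // propagate and computeGradient for each sample.
            for (double[] x : batch) {
                input.propagate(x);
                input.computeGradient(x);
            }

            // Apply the accumulated update once per batch. The first argument is
            // assumed to be the mini-batch size; rho and epsilon are assumed to be
            // RMSProp-style parameters (placeholder values throughout).
            input.update(batch.length, 0.01, 0.9, 1e-4, 0.9, 1e-7);
        }
    }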