Class InputLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.InputLayer
All Implemented Interfaces:
Serializable, AutoCloseable
Field Summary
Fields inherited from class Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
Constructors:
InputLayer(int p)
    Constructor.
InputLayer(int p, double dropout)
    Constructor.
Method Summary
void backpropagate(Vector lowerLayerGradient)
    Propagates the errors back to a lower layer.
void computeGradient(Vector x)
    Computes the parameter gradient for a sample of (mini-)batch.
void computeGradientUpdate(Vector x, double learningRate, double momentum, double decay)
    Computes the parameter gradient and updates the weights.
void propagate(Vector x)
    Propagates the signals from a lower layer to this layer.
String toString()
void transform(Vector x)
    The activation or output function.
void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)
    Adjusts network weights by the back-propagation algorithm.
Constructor Details
InputLayer
public InputLayer(int p)
Constructor.
Parameters:
p - the number of input variables (not including bias value).
InputLayer
public InputLayer(int p, double dropout)
Constructor.
Parameters:
p - the number of input variables (not including bias value).
dropout - the dropout rate.
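The two constructors differ only in whether input dropout is applied. As a rough illustration of what an input-dropout rate conventionally means (inverted dropout: each input is zeroed with probability dropout, and the survivors are rescaled by 1/(1 - dropout) so the expected value is preserved), here is a self-contained sketch. This is an assumption-laden illustration of the standard convention, not Smile's implementation; in Smile the relevant state lives in the inherited dropout and mask fields.

```java
import java.util.Arrays;
import java.util.Random;

public class DropoutSketch {
    // Inverted dropout applied to a copy of x: each entry is kept with
    // probability (1 - dropout) and scaled by 1 / (1 - dropout), so the
    // expected activation matches the no-dropout case.
    // NOTE: a generic sketch of the convention, not Smile's InputLayer code.
    static double[] dropout(double[] x, double dropout, Random rng) {
        double scale = 1.0 / (1.0 - dropout);
        double[] y = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            y[i] = rng.nextDouble() < dropout ? 0.0 : x[i] * scale;
        }
        return y;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        // With dropout = 0.2, each surviving entry is x[i] * 1.25.
        System.out.println(Arrays.toString(dropout(x, 0.2, new Random(42))));
    }
}
```

With dropout = 0 the transformation is the identity, which is why the single-argument constructor is equivalent to passing a zero dropout rate under this convention.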
Method Details
toString
public String toString()
propagate
Propagates the signals from a lower layer to this layer.
backpropagate
Description copied from class: Layer
Propagates the errors back to a lower layer.
Specified by:
backpropagate in class Layer
Parameters:
lowerLayerGradient - the gradient vector of the lower layer.
transform
The activation or output function.
computeGradient
Description copied from class: Layer
Computes the parameter gradient for a sample of (mini-)batch.
Overrides:
computeGradient in class Layer
Parameters:
x - the input vector.
computeGradientUpdate
Description copied from class: Layer
Computes the parameter gradient and updates the weights.
Overrides:
computeGradientUpdate in class Layer
Parameters:
x - the input vector.
learningRate - the learning rate.
momentum - the momentum factor.
decay - the weight decay factor.
update
public void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)
Description copied from class: Layer
Adjusts network weights by the back-propagation algorithm.
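The rho and epsilon parameters suggest an RMSProp-style adaptive denominator combined with classical momentum and weight decay. The following is a generic per-weight sketch of one such update rule, under the assumption of the textbook RMSProp-with-momentum formula; it is illustrative only and not necessarily the exact rule Smile applies, and the names step, velocity, and meanSquare are hypothetical.

```java
public class UpdateSketch {
    // One RMSProp-with-momentum step for a single weight.
    // ASSUMPTION: textbook formula, not necessarily Smile's exact rule.
    // g is the gradient accumulated over the mini-batch; m is its size.
    // Returns {newWeight, newVelocity, newMeanSquare}.
    static double[] step(double w, double velocity, double meanSquare, double g,
                         int m, double learningRate, double momentum,
                         double decay, double rho, double epsilon) {
        double grad = g / m;                               // average over the batch
        meanSquare = rho * meanSquare + (1 - rho) * grad * grad;
        double adjusted = grad / (Math.sqrt(meanSquare) + epsilon);
        velocity = momentum * velocity - learningRate * adjusted;
        w += velocity - learningRate * decay * w;          // weight decay shrinks w
        return new double[] {w, velocity, meanSquare};
    }

    public static void main(String[] args) {
        double[] s = step(0.5, 0.0, 0.0, 4.0, 4, 0.01, 0.9, 1e-4, 0.9, 1e-8);
        System.out.printf("w=%.6f velocity=%.6f%n", s[0], s[1]);
    }
}
```

Dividing by m explains the method's int m parameter: gradients accumulated by computeGradient over a mini-batch are averaged before the step is taken.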