Class InputLayer

java.lang.Object
  smile.base.mlp.Layer
    smile.base.mlp.InputLayer

All Implemented Interfaces:
Serializable, AutoCloseable
Field Summary

Fields inherited from class smile.base.mlp.Layer:
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary

InputLayer(int p)
    Constructor.
InputLayer(int p, double dropout)
    Constructor.
Method Summary

void backpropagate(Vector lowerLayerGradient)
    Propagates the errors back to a lower layer.
void computeGradient(Vector x)
    Computes the parameter gradient for a sample of a (mini-)batch.
void computeGradientUpdate(Vector x, double learningRate, double momentum, double decay)
    Computes the parameter gradient and updates the weights.
void propagate(Vector x)
    Propagates the signals from a lower layer to this layer.
String toString()
void transform(Vector x)
    The activation or output function.
void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)
    Adjusts network weights by the back-propagation algorithm.
Constructor Details

InputLayer

public InputLayer(int p)

Constructor.

Parameters:
p - the number of input variables (not including the bias value).

InputLayer

public InputLayer(int p, double dropout)

Constructor.

Parameters:
p - the number of input variables (not including the bias value).
dropout - the dropout rate.
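The dropout constructor can be illustrated with a minimal, self-contained sketch. This is not Smile's implementation; the class and method names below are hypothetical, and it shows only the usual inverted-dropout behavior an input layer with a dropout rate applies during training.

```java
import java.util.Arrays;
import java.util.Random;

// Illustrative sketch only, NOT Smile's implementation. The class and
// method names here are hypothetical; the point is what a dropout rate
// on an input layer usually means: inverted dropout during training.
class InputLayerSketch {
    private final int p;          // number of input variables (excluding bias)
    private final double dropout; // dropout rate in [0, 1)
    private final Random rng = new Random(42);

    InputLayerSketch(int p, double dropout) {
        if (p <= 0) throw new IllegalArgumentException("p must be positive: " + p);
        if (dropout < 0.0 || dropout >= 1.0) throw new IllegalArgumentException("invalid dropout rate: " + dropout);
        this.p = p;
        this.dropout = dropout;
    }

    // Training: zero each input with probability `dropout` and scale the
    // survivors by 1/(1-dropout), so the expected value is unchanged.
    // Inference: pass the input through untouched.
    double[] propagate(double[] x, boolean training) {
        if (x.length != p) throw new IllegalArgumentException("expected " + p + " inputs, got " + x.length);
        double[] out = Arrays.copyOf(x, p);
        if (training && dropout > 0.0) {
            double scale = 1.0 / (1.0 - dropout);
            for (int i = 0; i < p; i++) {
                out[i] = rng.nextDouble() < dropout ? 0.0 : out[i] * scale;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        InputLayerSketch layer = new InputLayerSketch(4, 0.5);
        double[] x = {1.0, 2.0, 3.0, 4.0};
        System.out.println(Arrays.toString(layer.propagate(x, false))); // identity at inference time
        System.out.println(Arrays.toString(layer.propagate(x, true)));  // randomly masked and rescaled
    }
}
```

With a rate of 0 the layer is an identity map; with a rate of 0.5 each surviving input is doubled so that the expected activation seen by the next layer is unchanged.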
Method Details

toString

public String toString()

propagate

public void propagate(Vector x)

Propagates the signals from a lower layer to this layer.
backpropagate

public void backpropagate(Vector lowerLayerGradient)

Description copied from class: Layer
Propagates the errors back to a lower layer.

Specified by:
backpropagate in class Layer

Parameters:
lowerLayerGradient - the gradient vector of the lower layer.
transform

public void transform(Vector x)

The activation or output function.
computeGradient

public void computeGradient(Vector x)

Description copied from class: Layer
Computes the parameter gradient for a sample of a (mini-)batch.

Overrides:
computeGradient in class Layer

Parameters:
x - the input vector.
computeGradientUpdate

public void computeGradientUpdate(Vector x, double learningRate, double momentum, double decay)

Description copied from class: Layer
Computes the parameter gradient and updates the weights.

Overrides:
computeGradientUpdate in class Layer

Parameters:
x - the input vector.
learningRate - the learning rate.
momentum - the momentum factor.
decay - the weight decay factor.
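The learningRate, momentum, and decay parameters correspond to the standard momentum-SGD update with L2 weight decay. As a sketch only (not Smile's code; the class and method names below are hypothetical), a single-weight step looks like:

```java
// Generic illustration of one SGD step with momentum and L2 weight decay.
// This mirrors the parameter names above but is NOT Smile's code; the
// class and method names are hypothetical.
class MomentumSgdSketch {
    // velocity[0] carries the running momentum term across calls.
    static double step(double w, double grad, double[] velocity,
                       double learningRate, double momentum, double decay) {
        // Weight decay adds decay * w to the gradient (L2 regularization).
        velocity[0] = momentum * velocity[0] - learningRate * (grad + decay * w);
        return w + velocity[0];
    }

    public static void main(String[] args) {
        double w = 1.0;
        double[] velocity = {0.0};
        // First step with zero momentum history: w = 1.0 - 0.1 * 0.5 = 0.95
        w = step(w, 0.5, velocity, 0.1, 0.9, 0.0);
        System.out.println(w);
    }
}
```

The momentum term only takes effect from the second call onward, once velocity carries gradient history.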
update

public void update(int m, double learningRate, double momentum, double decay, double rho, double epsilon)

Description copied from class: Layer
Adjusts network weights by the back-propagation algorithm.
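Beyond the momentum parameters, update also takes rho and epsilon, the usual knobs of an RMSProp-style adaptive step (decay rate of the squared-gradient moving average, plus a divide-by-zero guard). That reading is an assumption from the parameter names, not something this page confirms; the sketch below is a generic single-weight RMSProp step with hypothetical names, not Smile's implementation.

```java
// Hypothetical single-weight RMSProp step; rho and epsilon play the roles
// suggested by Layer.update's signature. This is an assumption from the
// parameter names, NOT Smile's actual implementation.
class RmsPropSketch {
    // meanSquare[0] carries the moving average of squared gradients across calls.
    static double step(double w, double grad, double[] meanSquare,
                       double learningRate, double rho, double epsilon) {
        // rho: decay rate of the squared-gradient average; epsilon avoids division by zero.
        meanSquare[0] = rho * meanSquare[0] + (1.0 - rho) * grad * grad;
        return w - learningRate * grad / (Math.sqrt(meanSquare[0]) + epsilon);
    }

    public static void main(String[] args) {
        double[] meanSquare = {0.0};
        double w = step(1.0, 0.5, meanSquare, 0.01, 0.9, 1e-8);
        System.out.println(w); // slightly below 1.0: the step size adapts to gradient magnitude
    }
}
```

Dividing by the root-mean-square gradient makes the effective step size roughly scale-invariant, which is why rho and epsilon appear alongside the plain learning rate.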