Package smile.base.mlp
Class HiddenLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.HiddenLayer
- All Implemented Interfaces:
Serializable
A hidden layer in the neural network.
Field Summary
Fields inherited from class smile.base.mlp.Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
HiddenLayer(int n, int p, double dropout, ActivationFunction activation)
Constructor.
Method Summary
void backpropagate(double[] lowerLayerGradient)
Propagates the errors back to a lower layer.
String toString()
void transform(double[] x)
The activation or output function.
Methods inherited from class smile.base.mlp.Layer
backpopagateDropout, builder, computeGradient, computeGradientUpdate, getInputSize, getOutputSize, gradient, input, input, leaky, leaky, leaky, linear, linear, mle, mse, of, output, propagate, propagateDropout, rectifier, rectifier, sigmoid, sigmoid, tanh, tanh, update
Constructor Details
HiddenLayer
public HiddenLayer(int n, int p, double dropout, ActivationFunction activation)
Constructor.
- Parameters:
n - the number of neurons.
p - the number of input variables (not including bias value).
dropout - the dropout rate.
activation - the activation function.
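The parameters fix the layer's internal shapes: with n neurons and p inputs, the weight matrix is n-by-p and the bias vector has length n. A minimal self-contained sketch of those dimensions (plain Java, not Smile's implementation; the class and field names are illustrative, and the activation parameter is omitted for brevity):

```java
// Illustrative sketch of a hidden layer's storage; not Smile's actual code.
public class HiddenLayerSketch {
    final double[][] weight; // n x p weight matrix
    final double[] bias;     // length-n bias vector
    final double dropout;    // fraction of neurons dropped during training

    public HiddenLayerSketch(int n, int p, double dropout) {
        this.weight = new double[n][p];
        this.bias = new double[n];
        this.dropout = dropout;
    }
}
```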
Method Details
toString
public String toString()
transform
public void transform(double[] x)
Description copied from class: Layer
The activation or output function.
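The forward pass computes an affine map followed by the activation applied elementwise: output_i = f(sum_j w_ij x_j + b_i). A hedged sketch using a sigmoid activation (plain Java with illustrative names, not Smile's internals, which operate on the layer's stored state):

```java
public class TransformSketch {
    /** Computes y = sigmoid(W x + b), the forward pass of one hidden layer. */
    public static double[] transform(double[][] w, double[] b, double[] x) {
        double[] y = new double[b.length];
        for (int i = 0; i < b.length; i++) {
            double s = b[i];                  // start from the bias term
            for (int j = 0; j < x.length; j++) {
                s += w[i][j] * x[j];          // accumulate the affine map W x
            }
            y[i] = 1.0 / (1.0 + Math.exp(-s)); // sigmoid activation
        }
        return y;
    }
}
```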
backpropagate
public void backpropagate(double[] lowerLayerGradient)
Description copied from class: Layer
Propagates the errors back to a lower layer.
- Specified by:
backpropagate in class Layer
- Parameters:
lowerLayerGradient - the gradient vector of the lower layer.
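The core of the backward pass distributes this layer's output gradient to the layer below through the transpose of the weight matrix: lowerGradient_j = sum_i w_ij * outputGradient_i. A minimal sketch of that step (plain Java, illustrative names; Smile's own implementation also folds the activation derivative into the output gradient before this multiplication):

```java
public class BackpropSketch {
    /** Computes g_lower = W^T g, passing this layer's gradient to the layer below. */
    public static double[] backpropagate(double[][] w, double[] outputGradient) {
        int p = w[0].length;                  // number of inputs (lower-layer size)
        double[] lower = new double[p];
        for (int i = 0; i < w.length; i++) {
            for (int j = 0; j < p; j++) {
                lower[j] += w[i][j] * outputGradient[i]; // transpose multiply
            }
        }
        return lower;
    }
}
```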