Class OutputLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.OutputLayer
All Implemented Interfaces:
Serializable, AutoCloseable
Field Summary
Fields inherited from class Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
OutputLayer(int n, int p, OutputFunction activation, Cost cost)
    Constructor.
Method Summary
void backpropagate(Vector lowerLayerGradient)
    Propagates the errors back to a lower layer.
void computeOutputGradient(Vector target, double weight)
    Compute the network output gradient.
Cost cost()
    Returns the cost function of the neural network.
String toString()
void transform(Vector x)
    The activation or output function.
Methods inherited from class Layer
backpopagateDropout, bias, builder, close, computeGradient, computeGradientUpdate, getInputSize, getOutputSize, gradient, input, input, leaky, leaky, leaky, linear, linear, mle, mse, of, output, propagate, propagateDropout, rectifier, rectifier, sigmoid, sigmoid, tanh, tanh, update, weight
Constructor Details
OutputLayer
public OutputLayer(int n, int p, OutputFunction activation, Cost cost)
Constructor.
Parameters:
n - the number of neurons.
p - the number of input variables (not including bias value).
activation - the output activation function.
cost - the cost function.
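To make the constructor's dimensions concrete: an output layer with n neurons and p input variables carries an n-by-p weight matrix plus an n-element bias vector. The sketch below is a hypothetical miniature in plain Java (the class name TinyOutputLayer, its fields, and the affine method are illustration only, not Smile's internals):

```java
import java.util.Random;

// Hypothetical miniature of an MLP output layer's storage: an n-by-p
// weight matrix plus an n-element bias vector, as implied by the
// OutputLayer(n, p, activation, cost) parameters. Not Smile's code.
public class TinyOutputLayer {
    final int n;              // number of neurons (outputs)
    final int p;              // number of input variables (bias excluded)
    final double[][] weight;  // n x p connection weights
    final double[] bias;      // one bias per neuron

    TinyOutputLayer(int n, int p, long seed) {
        this.n = n;
        this.p = p;
        this.weight = new double[n][p];
        this.bias = new double[n];
        Random rng = new Random(seed);
        // Small random initialization, as is typical for MLP layers.
        for (int i = 0; i < n; i++)
            for (int j = 0; j < p; j++)
                weight[i][j] = 0.1 * rng.nextGaussian();
    }

    // Affine part of propagation: out = W x + b (the activation,
    // e.g. the transform method, would be applied afterwards).
    double[] affine(double[] x) {
        double[] out = new double[n];
        for (int i = 0; i < n; i++) {
            double s = bias[i];
            for (int j = 0; j < p; j++) s += weight[i][j] * x[j];
            out[i] = s;
        }
        return out;
    }
}
```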
Method Details
toString

cost
Returns the cost function of the neural network.
transform
The activation or output function.
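Smile's OutputFunction options include a softmax-style activation for classification outputs. As a rough sketch of what such a transform computes (plain Java, not Smile's implementation; SoftmaxDemo is a hypothetical name), a numerically stable softmax applied in place looks like:

```java
// Illustrative softmax: the kind of in-place transform an output layer's
// activation applies to its pre-activation values. Not Smile's code.
public class SoftmaxDemo {
    // Numerically stable softmax: subtract the max before exponentiating
    // so large inputs cannot overflow Math.exp.
    public static void softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            x[i] = Math.exp(x[i] - max);
            sum += x[i];
        }
        // Normalize so the outputs form a probability distribution.
        for (int i = 0; i < x.length; i++) x[i] /= sum;
    }
}
```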
backpropagate
Description copied from class: Layer
Propagates the errors back to a lower layer.
Specified by:
backpropagate in class Layer
Parameters:
lowerLayerGradient - the gradient vector of the lower layer.
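In a fully connected layer, backpropagation routes each neuron's error back along its incoming weights: the lower layer's gradient is the transpose of the weight matrix applied to this layer's output gradient. A minimal sketch of that computation (plain Java with hypothetical names, not Smile's code):

```java
public class BackpropDemo {
    // lowerGradient[j] = sum_i weight[i][j] * outputGradient[i],
    // i.e. the transpose-matrix product that routes each neuron's
    // error signal back through its incoming connection weights.
    public static double[] backpropagate(double[][] weight, double[] outputGradient) {
        int n = weight.length;       // neurons in this layer
        int p = weight[0].length;    // neurons in the lower layer
        double[] lower = new double[p];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < p; j++)
                lower[j] += weight[i][j] * outputGradient[i];
        return lower;
    }
}
```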
computeOutputGradient
Compute the network output gradient.
Parameters:
target - the desired output.
weight - a positive weight value associated with the training instance.
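For the common pairing of a softmax output with a likelihood (cross-entropy) cost, the output gradient takes the well-known form weight * (target - output) per neuron (up to sign convention), with the instance weight simply scaling the error vector. A hedged sketch of that formula in plain Java (hypothetical class name, not Smile's code):

```java
public class OutputGradientDemo {
    // Per-instance output gradient for softmax + cross-entropy:
    // g[i] = weight * (target[i] - output[i]). The positive instance
    // weight scales how strongly this example influences the update.
    public static double[] outputGradient(double[] target, double[] output, double weight) {
        double[] g = new double[target.length];
        for (int i = 0; i < g.length; i++)
            g[i] = weight * (target[i] - output[i]);
        return g;
    }
}
```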