public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>
See Also: BaseOutputLayer, OutputLayer, Serialized Form

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer: inputMaskArray, inputMaskArrayState, labels

Fields inherited from class BaseLayer: conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score

| Constructor and Description |
|---|
| RnnOutputLayer(NeuralNetConfiguration conf) |
| RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training)<br>Trigger an activation with the last specified input |
| Pair&lt;Gradient,org.nd4j.linalg.api.ndarray.INDArray&gt; | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)<br>Calculate the gradient relative to the error in the next layer |
| org.nd4j.linalg.api.ndarray.INDArray | computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)<br>Compute the score for each example individually, after labels and input have been set. |
| double | f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)<br>Returns the f1 score for the given examples. |
| Pair&lt;org.nd4j.linalg.api.ndarray.INDArray,MaskState&gt; | feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)<br>Feed forward the input mask array, setting it in the layer as appropriate. |
| org.nd4j.linalg.api.ndarray.INDArray | getInput() |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d() |
| org.nd4j.linalg.api.ndarray.INDArray | output(boolean training)<br>Classify input |
| org.nd4j.linalg.api.ndarray.INDArray | output(org.nd4j.linalg.api.ndarray.INDArray input) |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)<br>Raw activations |
| protected org.nd4j.linalg.api.ndarray.INDArray | preOutput2d(boolean training) |
| void | setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)<br>Set the mask array. |
| Layer.Type | type()<br>Returns the layer type |
Methods inherited from class BaseOutputLayer: activate, activate, activate, applyMask, clear, computeGradientAndScore, computeScore, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, isPretrainLayer, iterate, labelProbabilities, numLabels, output, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer: accumulateScore, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, batchSize, calcGradient, calcL1, calcL2, clone, conf, createGradient, derivativeActivation, error, fit, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, init, initParams, input, layerConf, layerNameAndIndex, merge, numParams, numParams, params, paramTable, paramTable, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: activate, activate, activationMean, calcGradient, calcL1, calcL2, clone, derivativeActivation, error, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, merge, preOutput, preOutput, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, transpose

Methods inherited from interface Model: accumulateScore, applyLearningRateScoreDecay, batchSize, conf, fit, getOptimizer, getParam, init, initParams, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput

public RnnOutputLayer(NeuralNetConfiguration conf)

public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
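The constructors above are typically invoked by the framework rather than by user code. As a minimal sketch (assuming the DL4J 0.x configuration API; the layer sizes and the GravesLSTM hidden layer are hypothetical), a network ending in an RNN output layer is built from the configuration class org.deeplearning4j.nn.conf.layers.RnnOutputLayer, and MultiLayerNetwork then instantiates this implementation class internally:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RnnOutputLayerConfigSketch {
    public static void main(String[] args) {
        int nIn = 10;       // hypothetical number of input features per time step
        int nHidden = 20;   // hypothetical recurrent layer size
        int nClasses = 5;   // hypothetical number of output classes

        // Note: this builder uses the configuration-side RnnOutputLayer
        // (org.deeplearning4j.nn.conf.layers.RnnOutputLayer), not the
        // implementation class documented on this page.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)
                .list()
                .layer(0, new GravesLSTM.Builder()
                        .nIn(nIn).nOut(nHidden).activation("tanh").build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation("softmax").nIn(nHidden).nOut(nClasses).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();   // the network now holds an initialized RnnOutputLayer implementation
    }
}
```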
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)

Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>
Parameters: epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e. (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
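As an illustrative fragment (not taken from this page; it assumes an initialized RnnOutputLayer referenced as `layer`, with input and labels already set), the returned pair carries this layer's parameter gradients and the epsilon to pass to the layer below:

```java
// Sketch: backward step through the output layer. For an output layer the epsilon
// argument is typically unused (the error is computed from the set labels), so
// null is commonly passed here; treat that as an assumption about the 0.x API.
Pair<Gradient, INDArray> p = layer.backpropGradient(null);

Gradient paramGradients = p.getFirst();    // gradients w.r.t. this layer's weights and biases
INDArray epsilonBelow = p.getSecond();     // dC/da for the preceding layer, fed to its backpropGradient(...)
```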
public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)

Returns the f1 score for the given examples.

Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>
Parameters: examples - the examples to classify (one example in each row)
labels - the true labels
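A short fragment as a usage sketch (the 3d layout is an assumption based on the usual DL4J time-series convention, not stated on this page; `features` and `labels` are hypothetical arrays and `layer` is an initialized RnnOutputLayer):

```java
// Assumed shapes: features [miniBatchSize, nIn,  timeSeriesLength]
//                 labels   [miniBatchSize, nOut, timeSeriesLength] (one-hot per time step)
double f1 = layer.f1Score(features, labels);
System.out.println("F1 on this minibatch: " + f1);
```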
public org.nd4j.linalg.api.ndarray.INDArray getInput()

Overrides: getInput in class BaseLayer<RnnOutputLayer>

public Layer.Type type()

Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class BaseLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)

Raw activations.

Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<RnnOutputLayer>
Parameters: x - the input to transform
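To illustrate the difference between raw and activated outputs (a sketch under the assumption that the layer was configured with a softmax activation; `input3d` is a hypothetical input array and `layer` an initialized RnnOutputLayer):

```java
// Raw pre-activation scores (W*x + b applied per time step), no activation function
INDArray z = layer.preOutput(input3d, false);

// Activated output: the configured activation (e.g. softmax over the classes) applied
layer.setInput(input3d);
INDArray a = layer.output(false);
```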
protected org.nd4j.linalg.api.ndarray.INDArray preOutput2d(boolean training)

Overrides: preOutput2d in class BaseOutputLayer<RnnOutputLayer>

protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d()

Overrides: getLabels2d in class BaseOutputLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray output(org.nd4j.linalg.api.ndarray.INDArray input)

Overrides: output in class BaseOutputLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray output(boolean training)

Classify input.

Overrides: output in class BaseOutputLayer<RnnOutputLayer>
Parameters: training - determines if it is training; the input (can either be a matrix or vector). If it is a matrix, each row is considered an example and associated rows are classified accordingly. Each row will be the likelihood of a label given that example.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)

Trigger an activation with the last specified input.

Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>
Parameters: training - training or test mode
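As a concrete illustration of the time-series conventions (a sketch, not from this page: the 3d layout [miniBatchSize, size, timeSeriesLength] is the usual DL4J recurrent-data convention, the sizes are hypothetical, and `layer` is assumed to be an initialized RnnOutputLayer configured with nIn = 10 and nOut = 5):

```java
int miniBatch = 8, nIn = 10, nClasses = 5, tsLength = 15;   // hypothetical sizes

// Recurrent activations in DL4J are 3d: [miniBatchSize, featureSize, timeSeriesLength]
INDArray input = Nd4j.rand(new int[]{miniBatch, nIn, tsLength});

layer.setInput(input);
INDArray out = layer.output(false);   // or layer.activate(false); false = inference mode

// Expected shape: [miniBatch, nClasses, tsLength]; with a softmax activation,
// the class dimension sums to 1.0 at every time step of every example
System.out.println(java.util.Arrays.toString(out.shape()));
```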
public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)

Set the mask array. Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.

Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class BaseLayer<RnnOutputLayer>
Parameters: maskArray - Mask array to set

public Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)

Feed forward the input mask array, setting it in the layer as appropriate.

Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class BaseLayer<RnnOutputLayer>
Parameters: maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
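For variable-length sequences, a sketch of how a mask might be supplied (shapes and values are illustrative assumptions; a sequence mask in DL4J is typically a [miniBatchSize, timeSeriesLength] array of 1.0 for real time steps and 0.0 for padding):

```java
int miniBatch = 2, tsLength = 5;

// Per-example, per-time-step mask: 1.0 = real data, 0.0 = padding
INDArray mask = Nd4j.ones(miniBatch, tsLength);
mask.putScalar(new int[]{1, 3}, 0.0);   // example 1: time steps 3 and 4 are padding
mask.putScalar(new int[]{1, 4}, 0.0);

// Preferred route per the documentation above: feed the mask forward
// rather than calling setMaskArray(...) directly
Pair<INDArray, MaskState> result = layer.feedForwardMaskArray(mask, MaskState.Active, miniBatch);
```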
public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)

Compute the score for each example individually, after labels and input have been set.

Specified by: computeScoreForExamples in interface IOutputLayer
Overrides: computeScoreForExamples in class BaseOutputLayer<RnnOutputLayer>
Parameters: fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)
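A closing fragment showing per-example scoring (a sketch; it assumes `layer` already has its input and labels set, and passes 0.0 for both regularization terms as described above):

```java
// Per-example loss, without any L1/L2 regularization contribution
INDArray scores = layer.computeScoreForExamples(0.0, 0.0);

// One score per example in the minibatch
System.out.println("Scores shape: " + java.util.Arrays.toString(scores.shape()));
```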