public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>
See Also: BaseOutputLayer, OutputLayer, Serialized Form

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer: inputMaskArray, inputMaskArrayState, labels

Fields inherited from class BaseLayer: conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score
| Constructor and Description |
|---|
| `RnnOutputLayer(NeuralNetConfiguration conf)` |
| `RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)` |
| Modifier and Type | Method and Description |
|---|---|
| `org.nd4j.linalg.api.ndarray.INDArray` | `activate(boolean training)` Trigger an activation with the last specified input |
| `Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray>` | `backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)` Calculate the gradient relative to the error in the next layer |
| `org.nd4j.linalg.api.ndarray.INDArray` | `computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)` Compute the score for each example individually, after labels and input have been set |
| `double` | `f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)` Returns the F1 score for the given examples |
| `Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState>` | `feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)` Feed forward the input mask array, setting it in the layer as appropriate |
| `org.nd4j.linalg.api.ndarray.INDArray` | `getInput()` |
| `protected org.nd4j.linalg.api.ndarray.INDArray` | `getLabels2d()` |
| `org.nd4j.linalg.api.ndarray.INDArray` | `output(boolean training)` Classify input |
| `org.nd4j.linalg.api.ndarray.INDArray` | `output(org.nd4j.linalg.api.ndarray.INDArray input)` |
| `org.nd4j.linalg.api.ndarray.INDArray` | `preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)` Raw activations |
| `protected org.nd4j.linalg.api.ndarray.INDArray` | `preOutput2d(boolean training)` |
| `void` | `setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)` Set the mask array |
| `Layer.Type` | `type()` Returns the layer type |
Methods inherited from class BaseOutputLayer: activate, activate, activate, applyMask, clear, computeGradientAndScore, computeScore, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, isPretrainLayer, iterate, labelProbabilities, numLabels, output, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer: accumulateScore, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, batchSize, calcGradient, calcL1, calcL2, clone, conf, createGradient, derivativeActivation, error, fit, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, init, initParams, input, layerConf, layerNameAndIndex, merge, numParams, numParams, params, paramTable, paramTable, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: activate, activate, activationMean, calcGradient, calcL1, calcL2, clone, derivativeActivation, error, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, merge, preOutput, preOutput, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, transpose

Methods inherited from interface Model: accumulateScore, applyLearningRateScoreDecay, batchSize, conf, fit, getOptimizer, getParam, init, initParams, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput
public RnnOutputLayer(NeuralNetConfiguration conf)
public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
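These constructors are normally invoked by the framework rather than by user code; in practice the layer is obtained from an initialized network built with the corresponding configuration class. Below is a minimal, hedged sketch assuming the usual DL4J packages (`org.deeplearning4j.nn.conf.layers.RnnOutputLayer` for configuration, `org.deeplearning4j.nn.layers.recurrent.RnnOutputLayer` for this implementation class); builder method signatures vary slightly between DL4J versions.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class RnnOutputLayerUsage {

    public static void main(String[] args) {
        int nIn = 5, nHidden = 10, nOut = 3;

        // Configure a small recurrent net whose last layer is an RNN output layer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new GravesLSTM.Builder()
                        .nIn(nIn).nOut(nHidden).activation("tanh").build())
                .layer(1, new org.deeplearning4j.nn.conf.layers.RnnOutputLayer.Builder(LossFunction.MCXENT)
                        .activation("softmax").nIn(nHidden).nOut(nOut).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // After init(), layer 1 of the network is an instance of this implementation class.
        org.deeplearning4j.nn.layers.recurrent.RnnOutputLayer outputLayer =
                (org.deeplearning4j.nn.layers.recurrent.RnnOutputLayer) net.getLayer(1);

        // RNN activations are 3d: [miniBatchSize, nIn, timeSeriesLength].
        // Feed the output layer directly with (random) activations from the layer below.
        INDArray lstmActivations = Nd4j.rand(new int[]{4, nHidden, 7});
        outputLayer.setInput(lstmActivations);
        INDArray out = outputLayer.activate(false);   // shape [4, nOut, 7]
        System.out.println(java.util.Arrays.toString(out.shape()));
    }
}
```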
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>

Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
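A hedged sketch of driving the backward pass on this layer directly, continuing from the usage sketch after the constructors (it reuses `outputLayer`, `nHidden`, and `nOut` from there). The `Pair` class and its package depend on the DL4J version; passing null for epsilon is an assumption, based on an output layer deriving its error signal from the loss function and the labels that have been set.

```java
// Imports assumed: org.deeplearning4j.nn.gradient.Gradient and the version-appropriate Pair class.
INDArray in = Nd4j.rand(new int[]{4, nHidden, 7});   // activations arriving from the layer below
INDArray labels = Nd4j.zeros(4, nOut, 7);            // one-hot labels per example and time step
labels.putScalar(new int[]{0, 1, 0}, 1.0);           // e.g. example 0: class 1 at time step 0

outputLayer.setInput(in);
outputLayer.setLabels(labels);                       // setLabels is inherited from BaseOutputLayer

// For an output layer the error comes from the loss function and the labels set above,
// so no epsilon is propagated in from a layer above (null here is an assumption).
Pair<Gradient, INDArray> p = outputLayer.backpropGradient(null);
Gradient paramGradients = p.getFirst();              // gradients w.r.t. this layer's parameters
INDArray epsilonForLayerBelow = p.getSecond();       // dC/da for the layer below, shape [4, nHidden, 7]
```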
public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)

Returns the F1 score for the given examples.

Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>

Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels
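Continuing the same sketch, a single call evaluates the layer's predictions for one minibatch against known labels (the `in` and `labels` arrays defined above); this is an illustrative sketch, not a full evaluation loop.

```java
// F1 score for one minibatch of time-series data (sketch; reuses in/labels from above).
double f1 = outputLayer.f1Score(in, labels);
System.out.println("F1 on this minibatch: " + f1);
```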
public org.nd4j.linalg.api.ndarray.INDArray getInput()

Overrides: getInput in class BaseLayer<RnnOutputLayer>
public Layer.Type type()

Description copied from interface: Layer
Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class BaseLayer<RnnOutputLayer>
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)

Description copied from interface: Layer
Raw activations.

Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<RnnOutputLayer>

Parameters:
x - the input to transform
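As a hedged illustration of the difference between preOutput and activate (again reusing the arrays from the earlier sketches): preOutput returns the raw affine outputs, while activate applies the layer's activation function (softmax in the configuration sketched above) to them.

```java
outputLayer.setInput(in);
INDArray z = outputLayer.preOutput(in, false);   // raw (pre-activation-function) outputs
INDArray a = outputLayer.activate(false);        // activation function (softmax) applied to z
```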
protected org.nd4j.linalg.api.ndarray.INDArray preOutput2d(boolean training)

Overrides: preOutput2d in class BaseOutputLayer<RnnOutputLayer>
protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d()

Overrides: getLabels2d in class BaseOutputLayer<RnnOutputLayer>
public org.nd4j.linalg.api.ndarray.INDArray output(org.nd4j.linalg.api.ndarray.INDArray input)

Overrides: output in class BaseOutputLayer<RnnOutputLayer>
public org.nd4j.linalg.api.ndarray.INDArray output(boolean training)

Description copied from class: BaseOutputLayer
Classify input.

Overrides: output in class BaseOutputLayer<RnnOutputLayer>

Parameters:
training - determines whether the layer is run in training mode

Returns:
the output for the input that was last set (the input can be either a matrix or a vector). If it is a matrix, each row is considered an example and is classified accordingly; each row of the result is the likelihood of a label given that example.
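A short, hedged sketch of the training/test distinction, reusing the arrays from the earlier sketches: output(false) runs the layer in test/inference mode, output(true) in training mode (where, for example, dropout would be applied if configured). For RNN data the result is 3d.

```java
outputLayer.setInput(in);
INDArray probabilities = outputLayer.output(false);   // test/inference mode
// Shape is [miniBatchSize, nOut, timeSeriesLength]; with a softmax activation each
// [i, :, t] slice is a probability distribution over the nOut labels.
System.out.println(java.util.Arrays.toString(probabilities.shape()));   // e.g. [4, 3, 7]
```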
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)

Description copied from interface: Layer
Trigger an activation with the last specified input.

Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>

Parameters:
training - training or test mode
public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)

Description copied from interface: Layer
Set the mask array. In general, Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this method.

Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class BaseLayer<RnnOutputLayer>

Parameters:
maskArray - Mask array to set
public Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)

Description copied from interface: Layer
Feed forward the input mask array, setting it in the layer as appropriate.

Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class BaseLayer<RnnOutputLayer>

Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. This needs to be known, as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network).
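A hedged sketch of masking for variable-length sequences, continuing the earlier example. The mask is [miniBatchSize, timeSeriesLength], with 1.0 marking real time steps and 0.0 marking padding; MaskState is assumed to be org.deeplearning4j.nn.api.MaskState.

```java
INDArray mask = Nd4j.ones(4, 7);
mask.putScalar(new int[]{3, 5}, 0.0);   // example 3 has only 5 real time steps;
mask.putScalar(new int[]{3, 6}, 0.0);   // steps 5 and 6 are padding

Pair<INDArray, MaskState> fm = outputLayer.feedForwardMaskArray(mask, MaskState.Active, 4);
INDArray maskToPassOn = fm.getFirst();      // mask that would be passed to a layer above
MaskState stateToPassOn = fm.getSecond();   // (an output layer has no layer above, but the
                                            //  pair still reports what would be propagated)
```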
public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)

Compute the score for each example individually, after labels and input have been set.

Specified by: computeScoreForExamples in interface IOutputLayer
Overrides: computeScoreForExamples in class BaseOutputLayer<RnnOutputLayer>

Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)
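Finally, a hedged sketch of per-example scoring, reusing the `in` and `labels` arrays from above. Passing 0.0 for both regularization terms leaves L1/L2 penalties out of the per-example scores.

```java
outputLayer.setInput(in);
outputLayer.setLabels(labels);
INDArray perExampleScores = outputLayer.computeScoreForExamples(0.0, 0.0);
System.out.println(perExampleScores);   // one score per example in the minibatch
```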