public class CenterLossOutputLayer extends BaseOutputLayer<CenterLossOutputLayer>

Center loss output layer: augments the primary output-layer loss with a term that penalizes the distance between each example's feature vector and the running center of its class, encouraging intra-class compactness (Wen et al., 2016).
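For reference (this restates the objective from the center-loss paper, Wen et al. 2016; it is not part of the original Javadoc), the layer adds a weighted intra-class distance term to the primary loss, where x_i is the feature vector of example i, c_{y_i} the running center of its class, and lambda the weight of the center term:

```latex
\mathcal{L} \;=\; \mathcal{L}_{\text{primary}} \;+\; \frac{\lambda}{2} \sum_{i=1}^{m} \bigl\lVert x_i - c_{y_i} \bigr\rVert_2^2
```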
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer
Layer.TrainingMode, Layer.Type

Fields inherited from class org.deeplearning4j.nn.layers.BaseOutputLayer
inputMaskArray, inputMaskArrayState, labels

Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer
conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score
| Constructor and Description |
|---|
| `CenterLossOutputLayer(NeuralNetConfiguration conf)` |
| `CenterLossOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)` |
| Modifier and Type | Method and Description |
|---|---|
| `Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray>` | `backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)` Calculate the gradient relative to the error in the next layer. |
| `void` | `computeGradientAndScore()` Update the score. |
| `double` | `computeScore(double fullNetworkL1, double fullNetworkL2, boolean training)` Compute score after labels and input have been set. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)` Compute the score for each example individually, after labels and input have been set. |
| `Gradient` | `gradient()` Gets the gradient from one training iteration. |
| `Pair<Gradient,java.lang.Double>` | `gradientAndScore()` Get the gradient and score. |
| `protected void` | `setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)` |
Methods inherited from class org.deeplearning4j.nn.layers.BaseOutputLayer
activate, activate, activate, applyMask, clear, f1Score, f1Score, fit, fit, fit, fit, fit, getLabels, getLabels2d, isPretrainLayer, iterate, labelProbabilities, numLabels, output, output, output, predict, predict, preOutput2d, setLabels

Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer
accumulateScore, activate, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, batchSize, calcGradient, calcL1, calcL2, clone, conf, createGradient, derivativeActivation, error, feedForwardMaskArray, fit, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, init, initParams, input, layerConf, layerNameAndIndex, merge, numParams, numParams, params, paramTable, paramTable, preOutput, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, type, update, update, validateInput

Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface org.deeplearning4j.nn.api.Layer
activate, activate, activate, activationMean, calcGradient, calcL1, calcL2, clone, derivativeActivation, error, feedForwardMaskArray, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, merge, preOutput, preOutput, preOutput, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, transpose, type

Methods inherited from interface org.deeplearning4j.nn.api.Model
accumulateScore, applyLearningRateScoreDecay, batchSize, conf, fit, getOptimizer, getParam, init, initParams, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput
public CenterLossOutputLayer(NeuralNetConfiguration conf)
public CenterLossOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
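In practice this layer is rarely constructed directly; networks are normally assembled from the configuration class of the same name in `org.deeplearning4j.nn.conf.layers`. A minimal sketch of that route, assuming the builder API of the same DL4J version (in particular, treat the `alpha` and `lambda` builder methods, the center update rate and the center-loss weight, as assumptions to verify against your version):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class CenterLossConfigSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder()
                        .nIn(784).nOut(128)
                        .activation(Activation.RELU)
                        .build())
                // Center-loss output layer: the primary classification loss plus a
                // penalty on the distance between each example's features and its
                // class center. alpha = center update rate, lambda = weight of the
                // center-loss term (both assumed builder methods; verify locally).
                .layer(1, new CenterLossOutputLayer.Builder(
                                LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(128).nOut(10)
                        .activation(Activation.SOFTMAX)
                        .alpha(0.1)
                        .lambda(2e-4)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();  // builds the corresponding layer implementations documented here
    }
}
```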
public double computeScore(double fullNetworkL1, double fullNetworkL2, boolean training)

Compute score after labels and input have been set.

Specified by: computeScore in interface IOutputLayer
Overrides: computeScore in class BaseOutputLayer<CenterLossOutputLayer>

Parameters:
fullNetworkL1 - L1 regularization term for the entire network
fullNetworkL2 - L2 regularization term for the entire network
training - whether the score should be calculated at train or test time (this affects things like the application of dropout, etc.)
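As an illustration (not from the original Javadoc), a minimal sketch of calling this method directly, assuming an already-initialized layer instance and feature/label arrays of matching shape; `setInput`, `setLabels` and `computeScore` are the methods documented on this page and its base classes:

```java
import org.deeplearning4j.nn.layers.training.CenterLossOutputLayer;
import org.nd4j.linalg.api.ndarray.INDArray;

final class ScoringSketch {
    /** Scores one batch at test time, with no network-level L1/L2 terms included. */
    static double scoreBatch(CenterLossOutputLayer layer, INDArray features, INDArray labels) {
        layer.setInput(features);   // inherited from BaseLayer
        layer.setLabels(labels);    // inherited from BaseOutputLayer
        return layer.computeScore(0.0, 0.0, false);  // training = false: test-time behaviour (e.g. no dropout)
    }
}
```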
public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2)
Compute the score for each example individually, after labels and input have been set.

Specified by: computeScoreForExamples in interface IOutputLayer
Overrides: computeScoreForExamples in class BaseOutputLayer<CenterLossOutputLayer>

Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)

public void computeGradientAndScore()
Update the score.

Specified by: computeGradientAndScore in interface Model
Overrides: computeGradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)

Overrides: setScoreWithZ in class BaseOutputLayer<CenterLossOutputLayer>
public Pair<Gradient,java.lang.Double> gradientAndScore()

Get the gradient and score.

Specified by: gradientAndScore in interface Model
Overrides: gradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
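To illustrate how computeGradientAndScore, gradient and gradientAndScore relate, a sketch under the same assumptions as above; this mirrors what DL4J's optimizers do internally rather than typical user code:

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.layers.training.CenterLossOutputLayer;
import org.nd4j.linalg.api.ndarray.INDArray;

final class GradientStepSketch {
    /** One forward/backward pass over a batch. */
    static void step(CenterLossOutputLayer layer, INDArray features, INDArray labels) {
        layer.setInput(features);
        layer.setLabels(labels);
        layer.computeGradientAndScore();       // forward + backward pass; updates internal gradient and score
        Gradient gradient = layer.gradient();  // the gradient from this iteration
        double score = layer.score();          // the corresponding loss value (inherited from BaseLayer)
        System.out.println("score = " + score
                + ", gradient variables = " + gradient.gradientForVariable().keySet());
    }
}
```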
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)

Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1); or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
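Restated in standard backprop notation (a clarification added here, not part of the original Javadoc), with C the cost, z^(l) the pre-activations of layer l, and a^(l) = sigma(z^(l)):

```latex
\epsilon^{(l)} \;=\; \bigl(W^{(l+1)}\bigr)^{\top} \delta^{(l+1)} \;=\; \frac{\partial C}{\partial a^{(l)}},
\qquad
\delta^{(l)} \;=\; \epsilon^{(l)} \odot \sigma'\bigl(z^{(l)}\bigr) \;=\; \frac{\partial C}{\partial z^{(l)}}
```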
public Gradient gradient()
Gets the gradient from one training iteration.

Specified by: gradient in interface Model
Overrides: gradient in class BaseOutputLayer<CenterLossOutputLayer>
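A short sketch of inspecting the returned Gradient (gradientForVariable() and gradient() are methods of DL4J's org.deeplearning4j.nn.gradient.Gradient interface; the parameter key names in the comment are illustrative):

```java
import java.util.Map;
import org.deeplearning4j.nn.gradient.Gradient;
import org.nd4j.linalg.api.ndarray.INDArray;

final class GradientInspectionSketch {
    /** Prints the shape of each parameter's gradient from the last training iteration. */
    static void inspect(Gradient g) {
        // Per-variable view: parameter name (e.g. "W", "b") -> gradient array
        for (Map.Entry<String, INDArray> e : g.gradientForVariable().entrySet()) {
            System.out.println(e.getKey() + " -> " + java.util.Arrays.toString(e.getValue().shape()));
        }
        // Flattened view: all gradients concatenated into one array, as consumed by updaters
        INDArray flat = g.gradient();
        System.out.println("total gradient length: " + flat.length());
    }
}
```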