public class Subsampling1DLayer
extends SubsamplingLayer

Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class SubsamplingLayer:
convolutionMode, helper

Fields inherited from class BaseLayer:
conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score, solver
| Constructor and Description |
| --- |
| `Subsampling1DLayer(NeuralNetConfiguration conf)` |
| `Subsampling1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)` |
| Modifier and Type | Method and Description |
| --- | --- |
| `org.nd4j.linalg.api.ndarray.INDArray` | `activate(boolean training)` Trigger an activation with the last specified input |
| `Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray>` | `backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)` Calculate the gradient relative to the error in the next layer |
Methods inherited from class SubsamplingLayer:
accumulateScore, activationMean, calcGradient, calcL1, calcL2, computeGradientAndScore, error, fit, fit, getParam, isPretrainLayer, iterate, merge, numParams, params, score, setParams, transpose, type, update
Methods inherited from class BaseLayer:
activate, activate, activate, activate, activate, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, clear, clone, conf, createGradient, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, initParams, input, layerConf, layerNameAndIndex, numParams, paramTable, paramTable, preOutput, preOutput, preOutput, preOutput, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, validateInput
Constructor Detail:

public Subsampling1DLayer(NeuralNetConfiguration conf)

public Subsampling1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
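The DL4J layer itself operates on ND4J `INDArray`s and needs a full `NeuralNetConfiguration`; independent of that API, the forward operation a 1D subsampling (pooling) layer typically computes can be sketched in plain Java. This is a hypothetical illustration assuming max pooling with no padding, not the DL4J implementation:

```java
import java.util.Arrays;

public class MaxPool1dSketch {
    // 1D max pooling over a sequence: slide a window of size `kernel`
    // with step `stride` and keep the maximum of each window (no padding).
    static double[] maxPool1d(double[] in, int kernel, int stride) {
        int outLen = (in.length - kernel) / stride + 1;
        double[] out = new double[outLen];
        for (int i = 0; i < outLen; i++) {
            double max = Double.NEGATIVE_INFINITY;
            for (int j = 0; j < kernel; j++) {
                max = Math.max(max, in[i * stride + j]);
            }
            out[i] = max;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1, 3, 2, 5, 4, 1};
        // windows (1,3), (2,5), (4,1) -> 3, 5, 4
        System.out.println(Arrays.toString(maxPool1d(x, 2, 2))); // [3.0, 5.0, 4.0]
    }
}
```

The DL4J layer also supports other pooling types (e.g. average); only the per-window reduction would change in this sketch.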
Method Detail:

public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class SubsamplingLayer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)

Description copied from interface: Layer
Trigger an activation with the last specified input.

Specified by: activate in interface Layer
Overrides: activate in class SubsamplingLayer
Parameters:
training - training or test mode
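What `backpropGradient` does for a max-pooling layer can be pictured without the ND4J types: the incoming `epsilon` (dC/da of this layer's output) is routed back to whichever input position produced each pooled maximum, and every other position receives zero gradient. The following is a hedged plain-Java sketch of that routing under those assumptions, not the DL4J code:

```java
import java.util.Arrays;

public class MaxPoolBackpropSketch {
    // Backprop through 1D max pooling (no padding): each output's gradient
    // epsilon[i] is credited entirely to the argmax of its input window.
    static double[] maxPoolBackprop1d(double[] in, double[] epsilon,
                                      int kernel, int stride) {
        double[] grad = new double[in.length];
        for (int i = 0; i < epsilon.length; i++) {
            int argmax = i * stride;
            for (int j = 1; j < kernel; j++) {
                if (in[i * stride + j] > in[argmax]) {
                    argmax = i * stride + j;
                }
            }
            grad[argmax] += epsilon[i]; // only the max position gets gradient
        }
        return grad;
    }

    public static void main(String[] args) {
        double[] x = {1, 3, 2, 5, 4, 1};
        double[] eps = {0.1, 0.2, 0.3}; // one value per pooled output
        System.out.println(Arrays.toString(maxPoolBackprop1d(x, eps, 2, 2)));
        // [0.0, 0.1, 0.0, 0.2, 0.3, 0.0]
    }
}
```

Because pooling has no trainable parameters, the `Gradient` half of the returned `Pair` is empty for such a layer; the interesting output is the epsilon passed to the layer below.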