public class SubsamplingLayer extends BaseLayer<SubsamplingLayer>

Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type
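SubsamplingLayer implements the subsampling (pooling) layers used after convolutions: it downsamples its input by applying a pooling operation (max, average, etc.) over spatial windows, and has no trainable parameters of its own. As a concept-only sketch (plain Java arrays, not the ND4J-backed implementation), 2x2 max pooling with stride 2 looks like:

```java
// Illustrative 2x2 max pooling with stride 2 on a plain 2D array.
// Conceptual sketch only; the real layer operates on ND4J INDArrays
// with configurable kernel size, stride, and padding.
public class MaxPoolSketch {
    static double[][] maxPool2x2(double[][] in) {
        int outH = in.length / 2, outW = in[0].length / 2;
        double[][] out = new double[outH][outW];
        for (int i = 0; i < outH; i++) {
            for (int j = 0; j < outW; j++) {
                // take the maximum over the 2x2 window at (2i, 2j)
                double m = in[2 * i][2 * j];
                m = Math.max(m, in[2 * i][2 * j + 1]);
                m = Math.max(m, in[2 * i + 1][2 * j]);
                m = Math.max(m, in[2 * i + 1][2 * j + 1]);
                out[i][j] = m;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] x = {
            {1, 3, 2, 0},
            {4, 2, 1, 1},
            {0, 0, 5, 6},
            {1, 2, 7, 8}
        };
        // each 2x2 window collapses to its maximum: {{4, 2}, {2, 8}}
        System.out.println(java.util.Arrays.deepToString(maxPool2x2(x)));
    }
}
```

Because pooling keeps only a summary of each window, the output spatial dimensions shrink by the stride factor while the depth (number of channels) is unchanged.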
Modifier and Type | Field and Description |
---|---|
protected ConvolutionMode | convolutionMode |
protected SubsamplingHelper | helper |

Fields inherited from class BaseLayer:
conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score, solver
Constructor and Description |
---|
SubsamplingLayer(NeuralNetConfiguration conf) |
SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
Modifier and Type | Method and Description |
---|---|
void | accumulateScore(double accum) Sets a rolling tally for the score. |
org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) Trigger an activation with the last specified input. |
org.nd4j.linalg.api.ndarray.INDArray | activationMean() Calculate the mean representation of the activation for this layer. |
Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer. |
Gradient | calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray) Calculate the gradient. |
double | calcL1(boolean backpropParamsOnly) Calculate the L1 regularization term; 0.0 if regularization is not used. |
double | calcL2(boolean backpropParamsOnly) Calculate the L2 regularization term; 0.0 if regularization is not used. |
void | computeGradientAndScore() Update the score. |
Gradient | error(org.nd4j.linalg.api.ndarray.INDArray input) Calculate the error with respect to the current layer. |
void | fit() All models have a fit method. |
void | fit(org.nd4j.linalg.api.ndarray.INDArray input) Fit the model to the given data. |
org.nd4j.linalg.api.ndarray.INDArray | getParam(java.lang.String param) Get the parameter. |
boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs, etc.). |
void | iterate(org.nd4j.linalg.api.ndarray.INDArray input) Run one iteration of the network. |
void | merge(Layer layer, int batchSize) Averages the given layer from a mini-batch into this layer. |
int | numParams() The number of parameters for the model. |
org.nd4j.linalg.api.ndarray.INDArray | params() Returns the parameters of the neural network as a flattened row vector. |
double | score() Objective function: the specified objective. |
void | setParams(org.nd4j.linalg.api.ndarray.INDArray params) Set the parameters for this model. |
Layer | transpose() Return a transposed copy of the weights/bias (i.e., swap the number of inputs and outputs on the weights). |
Layer.Type | type() Returns the layer type. |
void | update(org.nd4j.linalg.api.ndarray.INDArray gradient, java.lang.String paramType) Perform one update, applying the gradient. |
Methods inherited from class BaseLayer:
activate, activate, activate, activate, activate, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, clear, clone, conf, createGradient, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, initParams, input, layerConf, layerNameAndIndex, numParams, paramTable, paramTable, preOutput, preOutput, preOutput, preOutput, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, validateInput
protected SubsamplingHelper helper
protected ConvolutionMode convolutionMode
public SubsamplingLayer(NeuralNetConfiguration conf)
public SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public double calcL2(boolean backpropParamsOnly)
Description copied from interface: Layer
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<SubsamplingLayer>
Parameters:
backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)
Description copied from interface: Layer
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<SubsamplingLayer>
Parameters:
backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public Layer.Type type()
Description copied from interface: Layer
Specified by: type in interface Layer
Overrides: type in class BaseLayer<SubsamplingLayer>
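For context on what calcL1 and calcL2 report: in the common convention, the L1 term is lambda1 * sum(|w_i|) and the L2 term is (lambda2 / 2) * sum(w_i^2) over the layer's weights (the 1/2 factor varies by library, and the sketch below is illustrative rather than DL4J's implementation; the helper names are hypothetical). Since a subsampling layer has no weights, its own contribution to both terms is typically 0.0.

```java
// Hypothetical helpers illustrating generic L1/L2 regularization terms
// over a flattened parameter vector. Not DL4J's code; the exact scaling
// convention (e.g. the 1/2 on L2) differs between libraries.
public class RegSketch {
    static double l1Term(double[] w, double lambda1) {
        double s = 0.0;
        for (double v : w) s += Math.abs(v);   // sum of absolute values
        return lambda1 * s;
    }

    static double l2Term(double[] w, double lambda2) {
        double s = 0.0;
        for (double v : w) s += v * v;         // sum of squares
        return 0.5 * lambda2 * s;              // 1/2 factor is one common convention
    }

    public static void main(String[] args) {
        double[] w = {1.0, -2.0, 3.0};
        System.out.println(l1Term(w, 0.1));    // lambda1 * (1 + 2 + 3)
        System.out.println(l2Term(w, 0.1));    // 0.5 * lambda2 * (1 + 4 + 9)
    }
}
```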
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Description copied from interface: Layer
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<SubsamplingLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Description copied from interface: Layer
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<SubsamplingLayer>
Parameters:
training - training or test mode

public Gradient error(org.nd4j.linalg.api.ndarray.INDArray input)
Description copied from interface: Layer
Specified by: error in interface Layer
Overrides: error in class BaseLayer<SubsamplingLayer>
Parameters:
input - the gradient for the forward layer. If this is the final layer, it will start with the error from the output; this is on the user to initialize.

public Gradient calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray)
Description copied from interface: Layer
Specified by: calcGradient in interface Layer
Overrides: calcGradient in class BaseLayer<SubsamplingLayer>
Parameters:
layerError - the layer error

public void merge(Layer layer, int batchSize)
Description copied from class: BaseLayer
Specified by: merge in interface Layer
Overrides: merge in class BaseLayer<SubsamplingLayer>
Parameters:
layer - the layer to average into this layer
batchSize - the batch size

public org.nd4j.linalg.api.ndarray.INDArray activationMean()
Description copied from interface: Layer
Specified by: activationMean in interface Layer
Overrides: activationMean in class BaseLayer<SubsamplingLayer>
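For a max-subsampling layer, backpropGradient routes each incoming epsilon value back to the input position that produced the maximum in its pooling window; all other positions in the window receive zero gradient. A standalone sketch for the 2x2, stride-2 case (illustrative plain-array code, not the ND4J implementation):

```java
// Backprop through 2x2/stride-2 max pooling: each epsilon entry is
// assigned to the argmax position of its window. Conceptual sketch only.
public class MaxPoolBackpropSketch {
    static double[][] backprop2x2(double[][] in, double[][] eps) {
        double[][] grad = new double[in.length][in[0].length];
        for (int i = 0; i < eps.length; i++) {
            for (int j = 0; j < eps[0].length; j++) {
                int bi = 2 * i, bj = 2 * j;   // window origin in the input
                int mi = bi, mj = bj;         // argmax found so far
                for (int di = 0; di < 2; di++) {
                    for (int dj = 0; dj < 2; dj++) {
                        if (in[bi + di][bj + dj] > in[mi][mj]) {
                            mi = bi + di;
                            mj = bj + dj;
                        }
                    }
                }
                // only the max position gets the upstream gradient
                grad[mi][mj] = eps[i][j];
            }
        }
        return grad;
    }
}
```

Average pooling would instead spread each epsilon value uniformly over its window; either way the layer has no weight gradients of its own, only the epsilon passed back to the previous layer.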
public Layer transpose()
Description copied from interface: Layer
Specified by: transpose in interface Layer
Overrides: transpose in class BaseLayer<SubsamplingLayer>

public boolean isPretrainLayer()
Specified by: isPretrainLayer in interface Layer

public void iterate(org.nd4j.linalg.api.ndarray.INDArray input)
Description copied from class: BaseLayer
Specified by: iterate in interface Model
Overrides: iterate in class BaseLayer<SubsamplingLayer>
Parameters:
input - the input to iterate on

public void fit()
Description copied from interface: Model
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<SubsamplingLayer>
public int numParams()
Description copied from class: BaseLayer
Specified by: numParams in interface Model
Overrides: numParams in class BaseLayer<SubsamplingLayer>

public void fit(org.nd4j.linalg.api.ndarray.INDArray input)
Description copied from interface: Model
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<SubsamplingLayer>
Parameters:
input - the data to fit the model to

public void computeGradientAndScore()
Description copied from interface: Model
Specified by: computeGradientAndScore in interface Model
Overrides: computeGradientAndScore in class BaseLayer<SubsamplingLayer>

public double score()
Description copied from class: BaseLayer
Specified by: score in interface Model
Overrides: score in class BaseLayer<SubsamplingLayer>

public void accumulateScore(double accum)
Description copied from interface: Model
Specified by: accumulateScore in interface Model
Overrides: accumulateScore in class BaseLayer<SubsamplingLayer>
Parameters:
accum - the amount to accumulate

public void update(org.nd4j.linalg.api.ndarray.INDArray gradient, java.lang.String paramType)
Description copied from interface: Model
Specified by: update in interface Model
Overrides: update in class BaseLayer<SubsamplingLayer>
Parameters:
gradient - the gradient to apply

public org.nd4j.linalg.api.ndarray.INDArray params()
Description copied from class: BaseLayer
Specified by: params in interface Model
Overrides: params in class BaseLayer<SubsamplingLayer>

public org.nd4j.linalg.api.ndarray.INDArray getParam(java.lang.String param)
Description copied from interface: Model
Specified by: getParam in interface Model
Overrides: getParam in class BaseLayer<SubsamplingLayer>
Parameters:
param - the key of the parameter

public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Description copied from interface: Model
Specified by: setParams in interface Model
Overrides: setParams in class BaseLayer<SubsamplingLayer>
Parameters:
params - the parameters for the model