public class SubsamplingLayer extends BaseLayer<SubsamplingLayer>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected ConvolutionMode | convolutionMode |
| protected SubsamplingHelper | helper |

Fields inherited from class BaseLayer: conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score, solver

| Constructor and Description |
|---|
| SubsamplingLayer(NeuralNetConfiguration conf) |
| SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| void | accumulateScore(double accum) - Sets a rolling tally for the score. |
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) - Trigger an activation with the last specified input. |
| org.nd4j.linalg.api.ndarray.INDArray | activationMean() - Calculate the mean representation for the activation for this layer. |
| Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) - Calculate the gradient relative to the error in the next layer. |
| Gradient | calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray) - Calculate the gradient. |
| double | calcL1(boolean backpropParamsOnly) - Calculate the L1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly) - Calculate the L2 regularization term; 0.0 if regularization is not used. |
| void | computeGradientAndScore() - Update the score. |
| Gradient | error(org.nd4j.linalg.api.ndarray.INDArray input) - Calculate the error with respect to the current layer. |
| void | fit() - All models have a fit method. |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray input) - Fit the model to the given data. |
| org.nd4j.linalg.api.ndarray.INDArray | getParam(java.lang.String param) - Get the parameter. |
| boolean | isPretrainLayer() - Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc.). |
| void | iterate(org.nd4j.linalg.api.ndarray.INDArray input) - Run one iteration of the network. |
| void | merge(Layer layer, int batchSize) - Averages the given logistic regression from a mini batch into this layer. |
| int | numParams() - The number of parameters for the model. |
| org.nd4j.linalg.api.ndarray.INDArray | params() - Returns the parameters of the neural network as a flattened row vector. |
| double | score() - Objective function: the specified objective. |
| void | setParams(org.nd4j.linalg.api.ndarray.INDArray params) - Set the parameters for this model. |
| Layer | transpose() - Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights). |
| Layer.Type | type() - Returns the layer type. |
| void | update(org.nd4j.linalg.api.ndarray.INDArray gradient, java.lang.String paramType) - Perform one update applying the gradient. |
Methods inherited from class BaseLayer: activate, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, clear, clone, conf, createGradient, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, initParams, input, layerConf, layerNameAndIndex, numParams, paramTable, preOutput, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, validateInput

protected SubsamplingHelper helper

protected ConvolutionMode convolutionMode
public SubsamplingLayer(NeuralNetConfiguration conf)
public SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
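In typical use this layer is instantiated by a network from its layer configuration rather than constructed by hand. The sketch below shows standalone construction and a forward pass using only the constructor and the setInput/activate methods listed on this page; the configuration-side builder (org.deeplearning4j.nn.conf.layers.SubsamplingLayer.Builder, its PoolingType enum, and the package names in the imports) is an assumption not documented here, so treat this as illustrative rather than authoritative.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
// Package of the implementation class documented on this page is assumed here:
import org.deeplearning4j.nn.layers.convolution.subsampling.SubsamplingLayer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class SubsamplingLayerConstructionSketch {
    public static void main(String[] args) {
        // Configuration holding a 2x2 max-pooling layer; the conf-side builder is an assumption.
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .layer(new org.deeplearning4j.nn.conf.layers.SubsamplingLayer.Builder(
                        org.deeplearning4j.nn.conf.layers.SubsamplingLayer.PoolingType.MAX)
                        .kernelSize(2, 2)
                        .stride(2, 2)
                        .build())
                .build();

        // Documented constructor: SubsamplingLayer(NeuralNetConfiguration conf)
        SubsamplingLayer layer = new SubsamplingLayer(conf);

        // Forward pass on a [minibatch, channels, height, width] input;
        // activate(false) means test/inference mode.
        INDArray input = Nd4j.rand(new int[]{1, 1, 4, 4});
        layer.setInput(input);
        INDArray out = layer.activate(false);   // expected shape [1, 1, 2, 2]
        System.out.println(out.shapeInfoToString());
    }
}
```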
public double calcL2(boolean backpropParamsOnly)
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<SubsamplingLayer>
Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any).

public double calcL1(boolean backpropParamsOnly)
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<SubsamplingLayer>
Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any).

public Layer.Type type()
Specified by: type in interface Layer
Overrides: type in class BaseLayer<SubsamplingLayer>

public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<SubsamplingLayer>
Parameters: epsilon - w^(L+1)*delta^(L+1), or equivalently dC/da, i.e. (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<SubsamplingLayer>
Parameters: training - training or test mode
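To make the activate(boolean)/backpropGradient(epsilon) contract concrete, here is a minimal plain-Java sketch of 2x2 max pooling on a single 2D feature map. This is an illustration of the operation only, not the library's implementation, which operates on 4D INDArrays, supports other pooling types, and may delegate to an optimized SubsamplingHelper. The forward pass keeps each window's maximum; the backward pass has no parameter gradient to compute and simply routes each epsilon entry back to the input position that produced the maximum.

```java
// Illustration only: 2x2 max pooling with stride 2 on one feature map,
// plus the backward pass that routes epsilon to the argmax positions.
public class MaxPoolSketch {
    static double[][] forward(double[][] in, int[][] argRow, int[][] argCol) {
        int outH = in.length / 2, outW = in[0].length / 2;
        double[][] out = new double[outH][outW];
        for (int i = 0; i < outH; i++) {
            for (int j = 0; j < outW; j++) {
                double best = Double.NEGATIVE_INFINITY;
                for (int di = 0; di < 2; di++) {
                    for (int dj = 0; dj < 2; dj++) {
                        double v = in[2 * i + di][2 * j + dj];
                        if (v > best) { best = v; argRow[i][j] = 2 * i + di; argCol[i][j] = 2 * j + dj; }
                    }
                }
                out[i][j] = best;
            }
        }
        return out;
    }

    // epsilon is dC/da of this layer's output; pooling has no parameters, so the whole
    // job of the backward pass is producing the epsilon for the layer below.
    static double[][] backward(double[][] epsilon, int[][] argRow, int[][] argCol, int inH, int inW) {
        double[][] epsOut = new double[inH][inW];
        for (int i = 0; i < epsilon.length; i++)
            for (int j = 0; j < epsilon[0].length; j++)
                epsOut[argRow[i][j]][argCol[i][j]] += epsilon[i][j];
        return epsOut;
    }

    public static void main(String[] args) {
        double[][] in = {
                {1, 3, 2, 0},
                {4, 2, 1, 5},
                {0, 1, 3, 2},
                {2, 2, 1, 4}
        };
        int[][] argRow = new int[2][2], argCol = new int[2][2];
        double[][] pooled = forward(in, argRow, argCol);          // {{4, 5}, {2, 4}}
        double[][] epsilon = {{1, 1}, {1, 1}};                    // pretend dC/da from the layer above
        double[][] epsBelow = backward(epsilon, argRow, argCol, 4, 4);
        System.out.println(java.util.Arrays.deepToString(pooled));
        System.out.println(java.util.Arrays.deepToString(epsBelow));
    }
}
```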
public Gradient error(org.nd4j.linalg.api.ndarray.INDArray input)
Specified by: error in interface Layer
Overrides: error in class BaseLayer<SubsamplingLayer>
Parameters: input - the gradient for the forward layer. If this is the final layer, it will start with the error from the output. This is on the user to initialize.

public Gradient calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray)
Specified by: calcGradient in interface Layer
Overrides: calcGradient in class BaseLayer<SubsamplingLayer>
Parameters: layerError - the layer error

public void merge(Layer layer, int batchSize)
Specified by: merge in interface Layer
Overrides: merge in class BaseLayer<SubsamplingLayer>
Parameters: layer - the logistic regression layer to average into this layer; batchSize - the batch size

public org.nd4j.linalg.api.ndarray.INDArray activationMean()
Specified by: activationMean in interface Layer
Overrides: activationMean in class BaseLayer<SubsamplingLayer>

public Layer transpose()
Specified by: transpose in interface Layer
Overrides: transpose in class BaseLayer<SubsamplingLayer>

public boolean isPretrainLayer()
Specified by: isPretrainLayer in interface Layer

public void iterate(org.nd4j.linalg.api.ndarray.INDArray input)
Specified by: iterate in interface Model
Overrides: iterate in class BaseLayer<SubsamplingLayer>
Parameters: input - the input to iterate on

public void fit()
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<SubsamplingLayer>

public int numParams()
Specified by: numParams in interface Model
Overrides: numParams in class BaseLayer<SubsamplingLayer>

public void fit(org.nd4j.linalg.api.ndarray.INDArray input)
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<SubsamplingLayer>
Parameters: input - the data to fit the model to

public void computeGradientAndScore()
Specified by: computeGradientAndScore in interface Model
Overrides: computeGradientAndScore in class BaseLayer<SubsamplingLayer>

public double score()
Specified by: score in interface Model
Overrides: score in class BaseLayer<SubsamplingLayer>

public void accumulateScore(double accum)
Specified by: accumulateScore in interface Model
Overrides: accumulateScore in class BaseLayer<SubsamplingLayer>
Parameters: accum - the amount to accumulate

public void update(org.nd4j.linalg.api.ndarray.INDArray gradient, java.lang.String paramType)
Specified by: update in interface Model
Overrides: update in class BaseLayer<SubsamplingLayer>
Parameters: gradient - the gradient to apply

public org.nd4j.linalg.api.ndarray.INDArray params()
Specified by: params in interface Model
Overrides: params in class BaseLayer<SubsamplingLayer>

public org.nd4j.linalg.api.ndarray.INDArray getParam(java.lang.String param)
Specified by: getParam in interface Model
Overrides: getParam in class BaseLayer<SubsamplingLayer>
Parameters: param - the key of the parameter

public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Specified by: setParams in interface Model
Overrides: setParams in class BaseLayer<SubsamplingLayer>
Parameters: params - the parameters for the model