public class BatchNormalization extends BaseLayer<BatchNormalization>
Nested classes inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected int | index |
| protected java.util.List<IterationListener> | listeners |
| protected static org.slf4j.Logger | log |
| protected org.nd4j.linalg.api.ndarray.INDArray | std |
| protected org.nd4j.linalg.api.ndarray.INDArray | xHat |
| protected org.nd4j.linalg.api.ndarray.INDArray | xMu |
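The std, xMu, and xHat fields appear to cache the intermediate terms of the batch normalization forward pass: xMu = x - mean, std = sqrt(var + eps), and xHat = xMu / std. A minimal ND4J sketch of those quantities follows; the 2d input layout, the reduction dimension, and the eps value are illustrative assumptions, not this class's exact code:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.ops.transforms.Transforms;

public class BatchNormForwardSketch {
    public static void main(String[] args) {
        // 2d [examples, features] input; sizes and values are arbitrary.
        INDArray x = Nd4j.rand(32, 10);
        double eps = 1e-5; // numerical-stability constant (value assumed)

        INDArray mean = x.mean(0);                           // per-feature batch mean
        INDArray xMu  = x.subRowVector(mean);                // x - mean        (cf. the xMu field)
        INDArray std  = Transforms.sqrt(x.var(0).addi(eps)); // sqrt(var + eps) (cf. the std field)
        INDArray xHat = xMu.divRowVector(std);               // normalized input (cf. the xHat field)

        // The layer's output is then gamma * xHat + beta, with learned gamma and beta.
        System.out.println(xHat);
    }
}
```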
Fields inherited from class BaseLayer: conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score, solver

| Constructor and Description |
|---|
| BatchNormalization(NeuralNetConfiguration conf) |
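The constructor takes a prepared NeuralNetConfiguration; in practice this implementation class is usually instantiated for you from the configuration class of the same name in org.deeplearning4j.nn.conf.layers. A minimal sketch, assuming the 0.x-era list-builder API and arbitrary layer sizes:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.BatchNormalization;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class BatchNormConfigSketch {
    public static void main(String[] args) {
        // A dense layer followed by batch normalization; all sizes are hypothetical.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(42)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100).build())
                .layer(1, new BatchNormalization.Builder().nOut(100).build())
                .layer(2, new OutputLayer.Builder().nIn(100).nOut(10).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // builds the org.deeplearning4j.nn.layers.normalization.BatchNormalization layer internally
    }
}
```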
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) Trigger an activation with the last specified input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training) Initialize the layer with the given input and return the activation for this layer given this input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(Layer.TrainingMode training) Trigger an activation with the last specified input |
| Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer |
| Gradient | calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray) Calculate the gradient |
| double | calcL1(boolean backpropParamsOnly) Calculate the l1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly) Calculate the l2 regularization term; 0.0 if regularization is not used. |
| Layer | clone() Clone the layer |
| Gradient | error(org.nd4j.linalg.api.ndarray.INDArray input) Calculate error with respect to the current layer. |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray data) Fit the model to the given data |
| int | getIndex() Get the layer index. |
| java.util.Collection<IterationListener> | getListeners() Get the iteration listeners for this layer. |
| int[] | getShape(org.nd4j.linalg.api.ndarray.INDArray x) |
| Gradient | gradient() Calculate a gradient |
| boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc) |
| void | merge(Layer layer, int batchSize) Averages the given logistic regression from a mini batch into this layer |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x) Classify input |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training) Raw activations |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training) Raw activations |
| void | setIndex(int index) Set the layer index. |
| void | setListeners(IterationListener... listeners) Set the iteration listeners for this layer. |
| Layer | transpose() Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights) |
| Layer.Type | type() Returns the layer type |
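A sketch of driving the layer directly through the activate overloads above; the layer variable is assumed to be an already-initialized BatchNormalization instance (for example obtained via MultiLayerNetwork.getLayer from the configuration sketch earlier), and the input shape is arbitrary:

```java
import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// 'layer' is assumed to be an initialized BatchNormalization instance,
// e.g. Layer layer = net.getLayer(1);
INDArray input = Nd4j.rand(32, 100);

// Supply input and activate in one call, in training mode:
INDArray trainOut = layer.activate(input, Layer.TrainingMode.TRAIN);

// Or set the input first, then trigger an activation with the last specified input:
layer.setInput(input);
INDArray testOut = layer.activate(Layer.TrainingMode.TEST);
```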
Methods inherited from class BaseLayer: accumulateScore, activate, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, clear, computeGradientAndScore, conf, createGradient, derivativeActivation, feedForwardMaskArray, fit, getInput, getInputMiniBatchSize, getMaskArray, getOptimizer, getParam, gradientAndScore, init, initParams, input, iterate, layerConf, layerNameAndIndex, numParams, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setConf, setInput, setInputMiniBatchSize, setListeners, setMaskArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update, validateInput

protected static final org.slf4j.Logger log
protected int index
protected java.util.List<IterationListener> listeners
protected org.nd4j.linalg.api.ndarray.INDArray std
protected org.nd4j.linalg.api.ndarray.INDArray xMu
protected org.nd4j.linalg.api.ndarray.INDArray xHat
public BatchNormalization(NeuralNetConfiguration conf)
public double calcL2(boolean backpropParamsOnly)
Calculate the l2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)
Calculate the l1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
public Layer.Type type()
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class BaseLayer<BatchNormalization>

public Gradient error(org.nd4j.linalg.api.ndarray.INDArray input)
Calculate error with respect to the current layer.
Specified by: error in interface Layer
Overrides: error in class BaseLayer<BatchNormalization>
Parameters: input - the gradient for the forward layer. If this is the final layer, it will start with the error from the output. This is on the user to initialize.
public Gradient calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray)
Calculate the gradient.
Specified by: calcGradient in interface Layer
Overrides: calcGradient in class BaseLayer<BatchNormalization>
Parameters: layerError - the layer error

public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<BatchNormalization>
Parameters: epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
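In a manual backprop loop, the returned pair separates this layer's parameter gradients from the epsilon handed to the layer below. A sketch under stated assumptions: layer and epsilon already exist (a forward pass must have set the layer's input), epsilon matches the activation shape, the "gamma"/"beta" gradient keys mirror this layer's parameter names (not confirmed by this page), and the import for Pair differs across versions:

```java
// 'layer' is an initialized BatchNormalization layer; 'epsilon' is dC/da from the layer above.
Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon);

// First element: gradients for this layer's own parameters.
Gradient grad = result.getFirst();
INDArray dGamma = grad.getGradientFor("gamma"); // parameter key assumed
INDArray dBeta  = grad.getGradientFor("beta");  // parameter key assumed

// Second element: the epsilon to propagate to the layer below.
INDArray epsilonBelow = result.getSecond();
```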
public void merge(Layer layer, int batchSize)
Description copied from class BaseLayer: averages the given logistic regression from a mini batch into this layer.
Specified by: merge in interface Layer
Overrides: merge in class BaseLayer<BatchNormalization>
Parameters: layer - the logistic regression layer to average into this layer; batchSize - the batch size

public void fit(org.nd4j.linalg.api.ndarray.INDArray data)
Fit the model to the given data.
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<BatchNormalization>
Parameters: data - the data to fit the model to

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Trigger an activation with the last specified input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: training - training or test mode

public Gradient gradient()
Calculate a gradient.
Specified by: gradient in interface Model
Overrides: gradient in class BaseLayer<BatchNormalization>
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x)
Description copied from class BaseLayer: classify input.
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<BatchNormalization>
Parameters: x - the input (can either be a matrix or vector). If it's a matrix, each row is considered an example and associated rows are classified accordingly. Each row will be the likelihood of a label given that example.

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training)
Raw activations.
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<BatchNormalization>
Parameters: x - the input to transform

public org.nd4j.linalg.api.ndarray.INDArray activate(Layer.TrainingMode training)
Trigger an activation with the last specified input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: training - training or test mode

public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training)
Initialize the layer with the given input and return the activation for this layer given this input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: input - the input to use; training - train or test mode

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Raw activations.
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<BatchNormalization>
Parameters: x - the input to transform
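For batch normalization specifically, the training flag changes the statistics used, not just dropout-style behavior: in TRAIN mode the layer normalizes with the current mini-batch's mean and variance, while TEST mode uses the running estimates accumulated during training. That is standard batch-norm behavior rather than something this page states, so treat the following as a hedged illustration:

```java
// Same input, different modes: outputs generally differ, because TRAIN uses
// mini-batch statistics while TEST uses the accumulated global estimates.
INDArray duringTraining  = layer.activate(input, Layer.TrainingMode.TRAIN);
INDArray duringInference = layer.activate(input, Layer.TrainingMode.TEST);
```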
public Layer transpose()
Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights).
Specified by: transpose in interface Layer
Overrides: transpose in class BaseLayer<BatchNormalization>

public Layer clone()
Clone the layer.
Specified by: clone in interface Layer
Overrides: clone in class BaseLayer<BatchNormalization>

public java.util.Collection<IterationListener> getListeners()
Get the iteration listeners for this layer.
Specified by: getListeners in interface Layer
Overrides: getListeners in class BaseLayer<BatchNormalization>

public void setListeners(IterationListener... listeners)
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class BaseLayer<BatchNormalization>

public void setIndex(int index)
Set the layer index.
Specified by: setIndex in interface Layer
Overrides: setIndex in class BaseLayer<BatchNormalization>

public int getIndex()
Get the layer index.
Specified by: getIndex in interface Layer
Overrides: getIndex in class BaseLayer<BatchNormalization>

public boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc).
Specified by: isPretrainLayer in interface Layer

public int[] getShape(org.nd4j.linalg.api.ndarray.INDArray x)