public abstract class BasePretrainNetwork<LayerConfT extends BasePretrainNetwork> extends BaseLayer<LayerConfT>
**Nested classes/interfaces inherited from interface Layer:** Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
protected java.util.Collection<TrainingListener> | trainingListeners |

**Fields inherited from class BaseLayer:** conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, maskState, optimizer, params, paramsFlattened, score, solver
Constructor and Description |
---|
BasePretrainNetwork(NeuralNetConfiguration conf) |
BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
Modifier and Type | Method and Description |
---|---|
Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) - Calculate the gradient relative to the error in the next layer. |
double | calcL1(boolean backpropParamsOnly) - Calculate the L1 regularization term; 0.0 if regularization is not used. |
double | calcL2(boolean backpropParamsOnly) - Calculate the L2 regularization term; 0.0 if regularization is not used. |
protected Gradient | createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient) |
org.nd4j.linalg.api.ndarray.INDArray | getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel) - Corrupts the given input by binomial sampling at the given corruption level. |
int | numParams() - The number of parameters for the model, for backprop (i.e., excluding the visible bias). |
int | numParams(boolean backwards) - The number of parameters for the model. |
org.nd4j.linalg.api.ndarray.INDArray | params() - Returns the parameters of the neural network as a flattened row vector. |
java.util.Map<java.lang.String,org.nd4j.linalg.api.ndarray.INDArray> | paramTable(boolean backpropParamsOnly) - Table of parameters by key, for backprop; for many models (dense layers, etc.) all parameters are backprop parameters. |
abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v) - Sample the hidden distribution given the visible. |
abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h) - Sample the visible distribution given the hidden. |
void | setListeners(java.util.Collection<IterationListener> listeners) - Set the iteration listeners for this layer. |
void | setListeners(IterationListener... listeners) - Set the iteration listeners for this layer. |
void | setParams(org.nd4j.linalg.api.ndarray.INDArray params) - Set the parameters for this model. |
protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z) |
**Methods inherited from class BaseLayer:** accumulateScore, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, calcGradient, clear, clone, computeGradientAndScore, conf, createGradient, derivativeActivation, error, feedForwardMaskArray, fit, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, gradient, gradientAndScore, init, initParams, input, iterate, layerConf, layerNameAndIndex, merge, paramTable, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, toString, transpose, type, update, validateInput
**Methods inherited from class java.lang.Object:** equals, finalize, getClass, hashCode, notify, notifyAll, wait
**Methods inherited from interface Layer:** isPretrainLayer
protected java.util.Collection<TrainingListener> trainingListeners
public BasePretrainNetwork(NeuralNetConfiguration conf)
public BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public void setListeners(java.util.Collection<IterationListener> listeners)

Description copied from interface: Layer
Set the iteration listeners for this layer.

Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class BaseLayer<LayerConfT extends BasePretrainNetwork>
public void setListeners(IterationListener... listeners)

Description copied from interface: Layer
Set the iteration listeners for this layer.

Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class BaseLayer<LayerConfT extends BasePretrainNetwork>
public org.nd4j.linalg.api.ndarray.INDArray getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel)

Corrupts the given input by doing a binomial sampling given the corruption level.

Parameters:
x - the input to corrupt
corruptionLevel - the corruption value

protected Gradient createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient)
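The corruption semantics can be sketched in plain Java. This is an illustration of the binomial-mask idea only, not the ND4J-based implementation; `corrupt` is a hypothetical helper that keeps each element with probability (1 - corruptionLevel), as in a denoising autoencoder.

```java
import java.util.Arrays;
import java.util.Random;

public class CorruptionDemo {
    // Hypothetical stand-in for getCorruptedInput: each element is kept with
    // probability (1 - corruptionLevel) and zeroed otherwise, i.e. multiplied
    // by a Binomial(1, 1 - corruptionLevel) mask sample.
    static double[] corrupt(double[] x, double corruptionLevel, Random rng) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = (rng.nextDouble() < 1.0 - corruptionLevel) ? x[i] : 0.0;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        // Roughly 30% of elements are zeroed on average
        System.out.println(Arrays.toString(corrupt(x, 0.3, new Random(42))));
    }
}
```

With corruptionLevel 0.0 the input passes through unchanged; with 1.0 every element is zeroed.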
public int numParams(boolean backwards)

Description copied from interface: Model
The number of parameters for the model.

Specified by: numParams in interface Model
Overrides: numParams in class BaseLayer<LayerConfT extends BasePretrainNetwork>
public abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v)
v
- the visible to sample frompublic abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h)
h
- the hidden to sample fromprotected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)
setScoreWithZ
in class BaseLayer<LayerConfT extends BasePretrainNetwork>
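A concrete subclass (for example, an RBM with binary units) would typically implement sampleHiddenGivenVisible by computing activation probabilities and then Bernoulli-sampling them. The following is a minimal plain-Java sketch of that pattern with hypothetical names and plain arrays in place of INDArrays; it returns {probabilities, samples} to mirror the Pair contract.

```java
import java.util.Random;

public class SamplingSketch {
    // Hypothetical sketch for a binary RBM:
    // p(h_j = 1 | v) = sigmoid(sum_i v_i * W[i][j] + hBias[j]),
    // then each hidden unit is sampled from Bernoulli(p).
    static double[][] sampleHiddenGivenVisible(double[] v, double[][] W,
                                               double[] hBias, Random rng) {
        int nHidden = hBias.length;
        double[] probs = new double[nHidden];
        double[] samples = new double[nHidden];
        for (int j = 0; j < nHidden; j++) {
            double pre = hBias[j];
            for (int i = 0; i < v.length; i++) pre += v[i] * W[i][j];
            probs[j] = 1.0 / (1.0 + Math.exp(-pre));               // sigmoid activation
            samples[j] = rng.nextDouble() < probs[j] ? 1.0 : 0.0;  // Bernoulli draw
        }
        return new double[][]{probs, samples};
    }
}
```

sampleVisibleGivenHidden follows the same shape with the roles of v and h swapped (using the transposed weights and the visible bias), which is what makes alternating Gibbs steps possible during pretraining.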
public java.util.Map<java.lang.String,org.nd4j.linalg.api.ndarray.INDArray> paramTable(boolean backpropParamsOnly)

Description copied from interface: Model
Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters.

Specified by: paramTable in interface Model
Overrides: paramTable in class BaseLayer<LayerConfT extends BasePretrainNetwork>

Parameters:
backpropParamsOnly - If true, return backprop params only. If false: return all params (equivalent to paramsTable())

public org.nd4j.linalg.api.ndarray.INDArray params()

Description copied from class: BaseLayer
Returns the parameters of the neural network as a flattened row vector.

Specified by: params in interface Model
Overrides: params in class BaseLayer<LayerConfT extends BasePretrainNetwork>
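The relationship between paramTable(boolean), params(), and numParams() can be illustrated with a plain-Java sketch: a pretrain layer holds weights plus hidden and visible biases, params() concatenates the table values into one flat row vector, and the backprop-only view drops the visible bias. The key names "W", "b", "vb" and the sizes here are illustrative, not taken from the library.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ParamTableSketch {
    // Illustrative parameter table: weights, hidden bias, visible bias.
    static Map<String, double[]> paramTable(boolean backpropParamsOnly) {
        Map<String, double[]> table = new LinkedHashMap<>();
        table.put("W", new double[]{0.1, 0.2, 0.3, 0.4}); // weights
        table.put("b", new double[]{0.0, 0.0});           // hidden bias
        if (!backpropParamsOnly) {
            table.put("vb", new double[]{0.0, 0.0});      // visible bias: pretrain only
        }
        return table;
    }

    // params(): flatten the table values into a single row vector.
    static double[] params(boolean backpropParamsOnly) {
        Map<String, double[]> table = paramTable(backpropParamsOnly);
        int n = 0;
        for (double[] v : table.values()) n += v.length;
        double[] flat = new double[n];
        int pos = 0;
        for (double[] v : table.values()) {
            System.arraycopy(v, 0, flat, pos, v.length);
            pos += v.length;
        }
        return flat;
    }

    // numParams(true) excludes the visible bias, matching the doc comment
    // "for backprop (i.e., excluding visible bias)".
    static int numParams(boolean backpropParamsOnly) {
        return params(backpropParamsOnly).length;
    }
}
```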
public int numParams()

The number of parameters for the model, for backprop (i.e., excluding the visible bias).

Specified by: numParams in interface Model
Overrides: numParams in class BaseLayer<LayerConfT extends BasePretrainNetwork>
public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)

Description copied from interface: Model
Set the parameters for this model.

Specified by: setParams in interface Model
Overrides: setParams in class BaseLayer<LayerConfT extends BasePretrainNetwork>

Parameters:
params - the parameters for the model

public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<LayerConfT extends BasePretrainNetwork>

Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public double calcL2(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L2 regularization term; 0.0 if regularization is not used.

Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<LayerConfT extends BasePretrainNetwork>

Parameters:
backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L1 regularization term; 0.0 if regularization is not used.

Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<LayerConfT extends BasePretrainNetwork>

Parameters:
backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
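The two regularization terms amount to weighted norms over the selected parameters. A plain-Java illustration follows; lambda stands in for the configured l1/l2 coefficient, and the 0.5 factor on the L2 term is a common convention that may differ from the actual implementation.

```java
public class RegDemo {
    // L1 penalty: lambda * sum of absolute parameter values.
    static double l1(double[] w, double lambda) {
        double s = 0.0;
        for (double x : w) s += Math.abs(x);
        return lambda * s;
    }

    // L2 penalty: 0.5 * lambda * sum of squared parameter values
    // (the 0.5 is conventional; implementations vary).
    static double l2(double[] w, double lambda) {
        double s = 0.0;
        for (double x : w) s += x * x;
        return 0.5 * lambda * s;
    }
}
```

When backpropParamsOnly is false, the sums above would also run over the pretrain-only parameters (such as the visible bias), which is the whole difference between the two modes.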