public static class BatchNormalization.Builder extends FeedForwardLayer.Builder<BatchNormalization.Builder>
Modifier and Type | Field and Description |
---|---|
protected double | beta |
protected double | decay |
protected double | eps |
protected double | gamma |
protected boolean | isMinibatch |
protected boolean | lockGammaBeta |

Fields inherited from class FeedForwardLayer.Builder: nIn, nOut

Fields inherited from class Layer.Builder: activationFn, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, dropOut, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, layerName, learningRate, learningRatePolicy, learningRateSchedule, momentum, momentumAfter, rho, rmsDecay, updater, weightInit
Constructor and Description |
---|
Builder() |
Builder(boolean lockGammaBeta) |
Builder(double decay, boolean isMinibatch) |
Builder(double gamma, double beta) |
Builder(double gamma, double beta, boolean lockGammaBeta) |
Modifier and Type | Method and Description |
---|---|
BatchNormalization.Builder | beta(double beta): Used only when 'true' is passed to lockGammaBeta(boolean). |
BatchNormalization | build() |
BatchNormalization.Builder | decay(double decay): At test time, a global estimate of the mean and variance can be used, calculated via a moving average of the batch means/variances. |
BatchNormalization.Builder | eps(double eps): Epsilon value for batch normalization; a small floating-point value added to the variance (Algorithm 1 in http://arxiv.org/pdf/1502.03167v3.pdf) to reduce/avoid underflow issues. Default: 1e-5 |
BatchNormalization.Builder | gamma(double gamma): Used only when 'true' is passed to lockGammaBeta(boolean). |
BatchNormalization.Builder | lockGammaBeta(boolean lockGammaBeta): If set to true, lock the gamma and beta parameters to the values for each activation specified by gamma(double) and beta(double). |
BatchNormalization.Builder | minibatch(boolean minibatch): Whether minibatch training is being used. |
Methods inherited from class FeedForwardLayer.Builder: nIn, nOut

Methods inherited from class Layer.Builder: activation, activation, activation, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, dropOut, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, learningRate, learningRateDecayPolicy, learningRateSchedule, momentum, momentumAfter, name, rho, rmsDecay, updater, weightInit
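As a sketch of how the summary methods above are typically combined, here is a minimal builder configuration. The class, constructor, and method names come from this page; the nIn/nOut setters are the members inherited from FeedForwardLayer.Builder, and the Deeplearning4j library is assumed to be on the classpath.

```java
import org.deeplearning4j.nn.conf.layers.BatchNormalization;

// Configure a batch-normalization layer with the builder methods documented above.
BatchNormalization bn = new BatchNormalization.Builder()
        .nIn(64)            // inherited from FeedForwardLayer.Builder
        .nOut(64)
        .eps(1e-5)          // epsilon added to variance (default: 1e-5)
        .decay(0.9)         // moving-average decay for the global mean/variance estimates
        .minibatch(true)    // use minibatch statistics during training
        .build();
```

Since every setter returns BatchNormalization.Builder, the calls chain in any order before the final build().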
protected double decay
protected double eps
protected boolean isMinibatch
protected boolean lockGammaBeta
protected double gamma
protected double beta
public Builder(double decay, boolean isMinibatch)
public Builder(double gamma, double beta)
public Builder(double gamma, double beta, boolean lockGammaBeta)
public Builder(boolean lockGammaBeta)
public Builder()
public BatchNormalization.Builder minibatch(boolean minibatch)

Parameters:
minibatch - Minibatch parameter

public BatchNormalization.Builder gamma(double gamma)

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

Parameters:
gamma - Gamma parameter for all activations, used only with locked gamma/beta configuration mode

public BatchNormalization.Builder beta(double beta)

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

Parameters:
beta - Beta parameter for all activations, used only with locked gamma/beta configuration mode

public BatchNormalization.Builder eps(double eps)

Parameters:
eps - Epsilon value to use

public BatchNormalization.Builder decay(double decay)

Parameters:
decay - Decay value to use for global stats calculation

public BatchNormalization.Builder lockGammaBeta(boolean lockGammaBeta)

If set to true: lock the gamma and beta parameters to the values for each activation, specified by gamma(double) and beta(double). Default: false -> learn gamma and beta parameter values during network training.

Parameters:
lockGammaBeta - If true: use fixed beta/gamma values. If false: learn during training.

public BatchNormalization build()

Specified by:
build in class Layer.Builder<BatchNormalization.Builder>
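To illustrate the locked gamma/beta mode described under lockGammaBeta(boolean), a short sketch follows. The three-argument constructor matches the Builder(double gamma, double beta, boolean lockGammaBeta) signature listed above; the numeric values are illustrative only, and the Deeplearning4j library is assumed.

```java
import org.deeplearning4j.nn.conf.layers.BatchNormalization;

// Lock gamma and beta to fixed values instead of learning them.
// Equivalent to: new BatchNormalization.Builder().lockGammaBeta(true).gamma(1.0).beta(0.0)
BatchNormalization fixedBn = new BatchNormalization.Builder(1.0, 0.0, true)
        .eps(1e-5)
        .build();

// Default mode: gamma and beta are learned during training, so any
// gamma(...)/beta(...) values are ignored unless lockGammaBeta(true) is set.
BatchNormalization learnedBn = new BatchNormalization.Builder()
        .lockGammaBeta(false)
        .build();
```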