public static class NeuralNetConfiguration.Builder
extends java.lang.Object
implements java.lang.Cloneable
Modifier and Type | Field and Description
---|---
protected org.nd4j.linalg.activations.IActivation | activationFn
protected double | adamMeanDecay
protected double | adamVarDecay
protected double | biasInit
protected double | biasLearningRate
protected ConvolutionMode | convolutionMode
protected Distribution | dist
protected double | dropOut
protected double | epsilon
protected GradientNormalization | gradientNormalization
protected double | gradientNormalizationThreshold
protected double | l1
protected double | l1Bias
protected double | l2
protected double | l2Bias
protected Layer | layer
protected double | leakyreluAlpha (Deprecated)
protected double | learningRate
protected LearningRatePolicy | learningRatePolicy
protected java.util.Map<java.lang.Integer,java.lang.Double> | learningRateSchedule
protected double | lrPolicyDecayRate
protected double | lrPolicyPower
protected double | lrPolicySteps
protected double | lrScoreBasedDecay
protected int | maxNumLineSearchIterations
protected boolean | miniBatch
protected boolean | minimize
protected double | momentum
protected java.util.Map<java.lang.Integer,java.lang.Double> | momentumSchedule
protected int | numIterations
protected OptimizationAlgorithm | optimizationAlgo
protected boolean | pretrain
protected double | rho
protected double | rmsDecay
protected long | seed
protected StepFunction | stepFunction
protected Updater | updater
protected boolean | useDropConnect
protected boolean | useRegularization
protected WeightInit | weightInit
Constructor and Description
---
Builder()
Builder(NeuralNetConfiguration newConf)
Modifier and Type | Method and Description
---|---
NeuralNetConfiguration.Builder | activation(org.nd4j.linalg.activations.Activation activation): Activation function / neuron non-linearity
NeuralNetConfiguration.Builder | activation(org.nd4j.linalg.activations.IActivation activationFunction): Activation function / neuron non-linearity
NeuralNetConfiguration.Builder | activation(java.lang.String activationFunction): Deprecated. Use activation(Activation) or activation(IActivation)
NeuralNetConfiguration.Builder | adamMeanDecay(double adamMeanDecay): Mean decay rate for Adam updater.
NeuralNetConfiguration.Builder | adamVarDecay(double adamVarDecay): Variance decay rate for Adam updater.
NeuralNetConfiguration.Builder | biasInit(double biasInit): Constant for bias initialization.
NeuralNetConfiguration.Builder | biasLearningRate(double biasLearningRate): Bias learning rate.
NeuralNetConfiguration | build(): Return a configuration based on this builder
NeuralNetConfiguration.Builder | clone()
NeuralNetConfiguration.Builder | convolutionMode(ConvolutionMode convolutionMode)
NeuralNetConfiguration.Builder | dist(Distribution dist): Distribution to sample initial weights from.
NeuralNetConfiguration.Builder | dropOut(double dropOut): Dropout probability.
NeuralNetConfiguration.Builder | epsilon(double epsilon): Epsilon value for updaters: Adam, RMSProp, Adagrad, Adadelta. Default values: Adam.DEFAULT_ADAM_EPSILON, RmsProp.DEFAULT_RMSPROP_EPSILON, AdaGrad.DEFAULT_ADAGRAD_EPSILON, AdaDelta.DEFAULT_ADADELTA_EPSILON
NeuralNetConfiguration.Builder | gradientNormalization(GradientNormalization gradientNormalization): Gradient normalization strategy.
NeuralNetConfiguration.Builder | gradientNormalizationThreshold(double threshold): Threshold for gradient normalization; used only for GradientNormalization.ClipL2PerLayer, GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue. L2 threshold for the first two types of clipping, or absolute-value threshold for the last type. Not used otherwise.
ComputationGraphConfiguration.GraphBuilder | graphBuilder(): Create a GraphBuilder (for creating a ComputationGraphConfiguration).
NeuralNetConfiguration.Builder | iterations(int numIterations): Number of optimization iterations.
NeuralNetConfiguration.Builder | l1(double l1): L1 regularization coefficient for the weights.
NeuralNetConfiguration.Builder | l1Bias(double l1Bias): L1 regularization coefficient for the bias.
NeuralNetConfiguration.Builder | l2(double l2): L2 regularization coefficient for the weights.
NeuralNetConfiguration.Builder | l2Bias(double l2Bias): L2 regularization coefficient for the bias.
NeuralNetConfiguration.Builder | layer(Layer layer): Layer class.
NeuralNetConfiguration.Builder | leakyreluAlpha(double leakyreluAlpha): Deprecated. Use activation(IActivation) with leaky ReLU, setting the alpha value directly in the constructor.
NeuralNetConfiguration.Builder | learningRate(double learningRate): Learning rate.
NeuralNetConfiguration.Builder | learningRateDecayPolicy(LearningRatePolicy policy): Learning rate decay policy.
NeuralNetConfiguration.Builder | learningRateSchedule(java.util.Map<java.lang.Integer,java.lang.Double> learningRateSchedule): Learning rate schedule.
NeuralNetConfiguration.Builder | learningRateScoreBasedDecayRate(double lrScoreBasedDecay): Rate to decrease the learning rate by when the score stops improving.
NeuralNetConfiguration.ListBuilder | list(): Create a ListBuilder (for creating a MultiLayerConfiguration).
NeuralNetConfiguration.ListBuilder | list(Layer... layers): Create a ListBuilder (for creating a MultiLayerConfiguration) with the specified layers.
NeuralNetConfiguration.Builder | lrPolicyDecayRate(double lrPolicyDecayRate): Set the decay rate for the learning rate decay policy.
NeuralNetConfiguration.Builder | lrPolicyPower(double lrPolicyPower): Set the power used for the inverse learning rate policy.
NeuralNetConfiguration.Builder | lrPolicySteps(double lrPolicySteps): Set the number of steps used for the step learning rate decay policy.
NeuralNetConfiguration.Builder | maxNumLineSearchIterations(int maxNumLineSearchIterations): Maximum number of line search iterations.
NeuralNetConfiguration.Builder | miniBatch(boolean miniBatch): Process input as minibatch vs full dataset.
NeuralNetConfiguration.Builder | minimize(boolean minimize): Whether to minimize or maximize the objective (cost) function. Defaults to minimize (true).
NeuralNetConfiguration.Builder | momentum(double momentum): Momentum rate. Used only when the updater is set to Updater.NESTEROVS.
NeuralNetConfiguration.Builder | momentumAfter(java.util.Map<java.lang.Integer,java.lang.Double> momentumAfter): Momentum schedule.
NeuralNetConfiguration.Builder | optimizationAlgo(OptimizationAlgorithm optimizationAlgo): Optimization algorithm to use.
NeuralNetConfiguration.Builder | regularization(boolean useRegularization): Whether to use regularization (l1, l2, dropout, etc.).
NeuralNetConfiguration.Builder | rho(double rho): AdaDelta coefficient.
NeuralNetConfiguration.Builder | rmsDecay(double rmsDecay): Decay rate for RMSProp.
NeuralNetConfiguration.Builder | seed(int seed): Random number generator seed.
NeuralNetConfiguration.Builder | seed(long seed): Random number generator seed.
NeuralNetConfiguration.Builder | stepFunction(StepFunction stepFunction): Step function to apply for back track line search.
NeuralNetConfiguration.Builder | updater(Updater updater): Gradient updater.
NeuralNetConfiguration.Builder | useDropConnect(boolean useDropConnect): Use drop connect: multiply the weight by a binomial sampling wrt the dropout probability.
NeuralNetConfiguration.Builder | weightInit(WeightInit weightInit): Weight initialization scheme.
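A sketch of how these builder methods compose into a complete network configuration. This assumes the DL4J 0.x API documented on this page; the layer classes (DenseLayer, OutputLayer), LossFunctions, and all hyperparameter values are illustrative and not taken from this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.Updater;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BuilderSketch {
    public static MultiLayerConfiguration configure() {
        return new NeuralNetConfiguration.Builder()
                .seed(12345)                  // fix RNG seed for reproducible weight init
                .iterations(1)                // optimization iterations per fit call
                .learningRate(0.01)
                .updater(Updater.NESTEROVS)
                .momentum(0.9)                // only used with Updater.NESTEROVS
                .regularization(true)         // must be enabled for l1/l2/dropout to apply
                .l2(1e-4)
                .weightInit(WeightInit.XAVIER)
                .list()                       // switch to ListBuilder for a MultiLayerConfiguration
                .layer(0, new DenseLayer.Builder()
                        .nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(
                                LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(100).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                .build();                     // ListBuilder.build() returns the configuration
    }
}
```

Note that `.list()` changes the builder type: global settings such as seed, updater, and regularization go before it, while per-layer settings go in the individual `Layer` builders.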
protected org.nd4j.linalg.activations.IActivation activationFn
protected WeightInit weightInit
protected double biasInit
protected Distribution dist
protected double learningRate
protected double biasLearningRate
protected java.util.Map<java.lang.Integer,java.lang.Double> learningRateSchedule
protected double lrScoreBasedDecay
protected double l1
protected double l2
protected double l1Bias
protected double l2Bias
protected double dropOut
protected Updater updater
protected double momentum
protected java.util.Map<java.lang.Integer,java.lang.Double> momentumSchedule
protected double epsilon
protected double rho
protected double rmsDecay
protected double adamMeanDecay
protected double adamVarDecay
protected Layer layer
@Deprecated protected double leakyreluAlpha
protected boolean miniBatch
protected int numIterations
protected int maxNumLineSearchIterations
protected long seed
protected boolean useRegularization
protected OptimizationAlgorithm optimizationAlgo
protected StepFunction stepFunction
protected boolean useDropConnect
protected boolean minimize
protected GradientNormalization gradientNormalization
protected double gradientNormalizationThreshold
protected LearningRatePolicy learningRatePolicy
protected double lrPolicyDecayRate
protected double lrPolicySteps
protected double lrPolicyPower
protected boolean pretrain
protected ConvolutionMode convolutionMode
public Builder()
public Builder(NeuralNetConfiguration newConf)
public NeuralNetConfiguration.Builder miniBatch(boolean miniBatch)
public NeuralNetConfiguration.Builder useDropConnect(boolean useDropConnect)
See dropOut(double); this is the probability of retaining a weight.
Parameters: useDropConnect - whether to use drop connect or not

public NeuralNetConfiguration.Builder minimize(boolean minimize)

public NeuralNetConfiguration.Builder maxNumLineSearchIterations(int maxNumLineSearchIterations)
Parameters: maxNumLineSearchIterations - > 0

public NeuralNetConfiguration.Builder layer(Layer layer)
public NeuralNetConfiguration.Builder stepFunction(StepFunction stepFunction)
public NeuralNetConfiguration.ListBuilder list()
Usage:
    .list()
    .layer(0, new DenseLayer.Builder()...build())
    ...
    .layer(n, new OutputLayer.Builder()...build())

public NeuralNetConfiguration.ListBuilder list(Layer... layers)
Usage:
    .list(
        new DenseLayer.Builder()...build(),
        ...,
        new OutputLayer.Builder()...build())
Parameters: layers - The layer configurations for the network

public ComputationGraphConfiguration.GraphBuilder graphBuilder()
public NeuralNetConfiguration.Builder iterations(int numIterations)
public NeuralNetConfiguration.Builder seed(int seed)
public NeuralNetConfiguration.Builder seed(long seed)
public NeuralNetConfiguration.Builder optimizationAlgo(OptimizationAlgorithm optimizationAlgo)
Parameters: optimizationAlgo - Optimization algorithm to use when training

public NeuralNetConfiguration.Builder regularization(boolean useRegularization)
public NeuralNetConfiguration.Builder clone()
Overrides: clone in class java.lang.Object
@Deprecated public NeuralNetConfiguration.Builder activation(java.lang.String activationFunction)
Deprecated. Use activation(Activation) or activation(IActivation)
public NeuralNetConfiguration.Builder activation(org.nd4j.linalg.activations.IActivation activationFunction)
See also: activation(Activation)

public NeuralNetConfiguration.Builder activation(org.nd4j.linalg.activations.Activation activation)
@Deprecated public NeuralNetConfiguration.Builder leakyreluAlpha(double leakyreluAlpha)
Deprecated. Use activation(IActivation) with leaky ReLU, setting the alpha value directly in the constructor.

public NeuralNetConfiguration.Builder weightInit(WeightInit weightInit)
See also: WeightInit

public NeuralNetConfiguration.Builder biasInit(double biasInit)
Parameters: biasInit - Constant for bias initialization

public NeuralNetConfiguration.Builder dist(Distribution dist)
public NeuralNetConfiguration.Builder learningRate(double learningRate)
public NeuralNetConfiguration.Builder biasLearningRate(double biasLearningRate)
public NeuralNetConfiguration.Builder learningRateSchedule(java.util.Map<java.lang.Integer,java.lang.Double> learningRateSchedule)
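The schedule map passed to learningRateSchedule maps an iteration number to the learning rate that takes effect from that iteration. A minimal sketch of constructing one with plain java.util (the specific iteration counts and rates below are illustrative, not from this page):

```java
import java.util.Map;
import java.util.TreeMap;

public class ScheduleExample {
    // Build an iteration -> learning-rate map. TreeMap keeps the
    // entries sorted by iteration number for readability.
    public static Map<Integer, Double> buildSchedule() {
        Map<Integer, Double> lrSchedule = new TreeMap<>();
        lrSchedule.put(0, 0.01);     // start at 0.01
        lrSchedule.put(1000, 0.005); // halve after 1000 iterations
        lrSchedule.put(3000, 0.001); // drop again after 3000 iterations
        return lrSchedule;
    }

    public static void main(String[] args) {
        Map<Integer, Double> s = buildSchedule();
        System.out.println(s); // {0=0.01, 1000=0.005, 3000=0.001}
    }
}
```

The resulting map would then be passed as `new NeuralNetConfiguration.Builder().learningRateSchedule(s)...`; the same Map<Integer,Double> shape is used by momentumAfter for momentum schedules.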
public NeuralNetConfiguration.Builder learningRateScoreBasedDecayRate(double lrScoreBasedDecay)
public NeuralNetConfiguration.Builder l1(double l1)
public NeuralNetConfiguration.Builder l2(double l2)
public NeuralNetConfiguration.Builder l1Bias(double l1Bias)
public NeuralNetConfiguration.Builder l2Bias(double l2Bias)
public NeuralNetConfiguration.Builder dropOut(double dropOut)
Parameters: dropOut - Dropout probability (probability of retaining an activation)

public NeuralNetConfiguration.Builder momentum(double momentum)
See also: Updater.NESTEROVS

public NeuralNetConfiguration.Builder momentumAfter(java.util.Map<java.lang.Integer,java.lang.Double> momentumAfter)
See also: Updater.NESTEROVS
public NeuralNetConfiguration.Builder updater(Updater updater)
See also: Updater

public NeuralNetConfiguration.Builder rho(double rho)
Parameters: rho -

public NeuralNetConfiguration.Builder epsilon(double epsilon)
Default values: Adam.DEFAULT_ADAM_EPSILON, RmsProp.DEFAULT_RMSPROP_EPSILON, AdaGrad.DEFAULT_ADAGRAD_EPSILON, AdaDelta.DEFAULT_ADADELTA_EPSILON
Parameters: epsilon - Epsilon value to use for adagrad or

public NeuralNetConfiguration.Builder rmsDecay(double rmsDecay)
public NeuralNetConfiguration.Builder adamMeanDecay(double adamMeanDecay)
public NeuralNetConfiguration.Builder adamVarDecay(double adamVarDecay)
public NeuralNetConfiguration.Builder gradientNormalization(GradientNormalization gradientNormalization)
Parameters: gradientNormalization - Type of normalization to use. Defaults to None.
See also: GradientNormalization

public NeuralNetConfiguration.Builder gradientNormalizationThreshold(double threshold)

public NeuralNetConfiguration.Builder learningRateDecayPolicy(LearningRatePolicy policy)
Parameters: policy - Type of policy to use. Defaults to None.

public NeuralNetConfiguration.Builder lrPolicyDecayRate(double lrPolicyDecayRate)
Parameters: lrPolicyDecayRate - rate.

public NeuralNetConfiguration.Builder lrPolicySteps(double lrPolicySteps)
Parameters: lrPolicySteps - number of steps

public NeuralNetConfiguration.Builder lrPolicyPower(double lrPolicyPower)
Parameters: lrPolicyPower - power

public NeuralNetConfiguration.Builder convolutionMode(ConvolutionMode convolutionMode)
public NeuralNetConfiguration build()