public static class LossLayer.Builder extends BaseOutputLayer.Builder<LossLayer.Builder>
Fields inherited from class BaseOutputLayer.Builder: lossFn

Fields inherited from class FeedForwardLayer.Builder: nIn, nOut

Fields inherited from class Layer.Builder: activationFn, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, dropOut, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, layerName, learningRate, learningRatePolicy, learningRateSchedule, momentum, momentumAfter, rho, rmsDecay, updater, weightInit

| Constructor and Description |
|---|
| Builder() |
| Builder(org.nd4j.linalg.lossfunctions.ILossFunction lossFunction) |
| Builder(org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction lossFunction) |
| Modifier and Type | Method and Description |
|---|---|
| LossLayer | build() |
| LossLayer.Builder | nIn(int nIn) |
| LossLayer.Builder | nOut(int nOut) |
Methods inherited from class BaseOutputLayer.Builder: lossFunction

Methods inherited from class Layer.Builder: activation, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, dropOut, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, learningRate, learningRateDecayPolicy, learningRateSchedule, momentum, momentumAfter, name, rho, rmsDecay, updater, weightInit

public Builder()
public Builder(org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction lossFunction)
public Builder(org.nd4j.linalg.lossfunctions.ILossFunction lossFunction)
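The three constructors above differ only in how the loss function is supplied. Below is a minimal sketch of the two parameterized forms, assuming the layer lives in org.deeplearning4j.nn.conf.layers and that a concrete ILossFunction implementation such as LossMCXENT is available; neither of those names is shown on this page.

```java
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;
import org.nd4j.linalg.lossfunctions.impl.LossMCXENT;

public class LossLayerConstructorSketch {
    public static void main(String[] args) {
        // Built-in loss selected via the LossFunctions.LossFunction enum.
        LossLayer.Builder fromEnum =
                new LossLayer.Builder(LossFunctions.LossFunction.MCXENT);

        // The same loss supplied as an ILossFunction instance; any custom
        // ILossFunction implementation could be passed here instead.
        // LossMCXENT is assumed to exist in org.nd4j.linalg.lossfunctions.impl.
        LossLayer.Builder fromInstance =
                new LossLayer.Builder(new LossMCXENT());
    }
}
```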
public LossLayer.Builder nIn(int nIn)
Overrides: nIn in class FeedForwardLayer.Builder<LossLayer.Builder>

public LossLayer.Builder nOut(int nOut)

Overrides: nOut in class FeedForwardLayer.Builder<LossLayer.Builder>

public LossLayer build()

Specified by: build in class Layer.Builder<LossLayer.Builder>
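A minimal end-to-end sketch of the builder: select the loss function in the constructor, optionally set an option inherited from Layer.Builder such as name, then call build() to obtain the LossLayer configuration. The package org.deeplearning4j.nn.conf.layers is an assumption, as it is not shown on this page; in practice the resulting layer is typically added as the final (output) layer of a network configuration.

```java
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LossLayerBuildSketch {
    public static void main(String[] args) {
        // Choose the loss in the constructor, set the inherited layer name,
        // and call build() to obtain the LossLayer configuration object.
        LossLayer lossLayer = new LossLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .name("loss")
                .build();
        System.out.println(lossLayer);
    }
}
```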