public static class TransferLearning.Builder
extends java.lang.Object
| Constructor and Description |
|---|
| `Builder(MultiLayerNetwork origModel)`: Multilayer network to tweak for transfer learning |
| Modifier and Type | Method and Description |
|---|---|
| `TransferLearning.Builder` | `addLayer(Layer layer)`: Add layers to the net. Required if layers are removed. |
| `MultiLayerNetwork` | `build()`: Returns a model with the fine-tune configuration and the specified architecture changes. |
| `TransferLearning.Builder` | `fineTuneConfiguration(FineTuneConfiguration finetuneConfiguration)`: Fine-tune settings specified here will overwrite the existing configuration, if any. For example, specifying a learning rate will set that learning rate on all layers. Refer to the FineTuneConfiguration class for more details. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, Distribution dist)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Can specify different weight init schemes for the specified layer and the layer that follows it. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, Distribution dist, WeightInit schemeNext)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Can specify different weight init schemes for the specified layer and the layer that follows it. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, WeightInit scheme)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Can specify different weight init schemes for the specified layer and the layer that follows it. |
| `TransferLearning.Builder` | `nOutReplace(int layerNum, int nOut, WeightInit scheme, WeightInit schemeNext)`: Modify the architecture of a layer by changing nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Can specify different weight init schemes for the specified layer and the layer that follows it. |
| `TransferLearning.Builder` | `removeLayersFromOutput(int layerNum)`: Remove the last "n" layers of the net. At least an output layer must be added back in. |
| `TransferLearning.Builder` | `removeOutputLayer()`: Helper method to remove the output layer of the net. |
| `TransferLearning.Builder` | `setFeatureExtractor(int layerNum)`: Specify a layer to set as a "feature extractor". The specified layer and the layers preceding it will be "frozen", with parameters staying constant. |
| `TransferLearning.Builder` | `setInputPreProcessor(int layer, InputPreProcessor processor)`: Specify the preprocessor for the added layers, for cases where it cannot be inferred automatically. |
`public Builder(MultiLayerNetwork origModel)`

Parameters:
- `origModel`: the multilayer network to tweak for transfer learning
`public TransferLearning.Builder fineTuneConfiguration(FineTuneConfiguration finetuneConfiguration)`

Parameters:
- `finetuneConfiguration`: the fine-tune configuration to apply; settings specified here overwrite the existing configuration, if any
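As a minimal sketch of building a `FineTuneConfiguration` to pass to this method, assuming DL4J and ND4J are on the classpath; the updater choice, learning rate, and seed are illustrative, not prescribed by this API:

```java
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.nd4j.linalg.learning.config.Adam;

public class FineTuneConfigExample {
    // Settings given here override the corresponding settings in the
    // original model's configuration; anything not specified is kept.
    public static FineTuneConfiguration config() {
        return new FineTuneConfiguration.Builder()
                .updater(new Adam(1e-4)) // applied to all layers of the tweaked net
                .seed(123)
                .build();
    }

    public static void main(String[] args) {
        System.out.println(config());
    }
}
```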
`public TransferLearning.Builder setFeatureExtractor(int layerNum)`

Parameters:
- `layerNum`: index of the layer to set as the feature extractor; this layer and the layers preceding it are frozen, with parameters staying constant
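A sketch of freezing the lower layers of a network, assuming DL4J on the classpath. The small two-layer net built here merely stands in for a real pretrained model; the layer index and sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FeatureExtractorExample {
    public static MultiLayerNetwork buildFrozen() {
        // Stand-in for a real pretrained model.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(8)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork pretrained = new MultiLayerNetwork(conf);
        pretrained.init();

        // Layer 0 (and any layers before it) becomes a frozen feature
        // extractor: its parameters stay constant during further training.
        return new TransferLearning.Builder(pretrained)
                .setFeatureExtractor(0)
                .build();
    }

    public static void main(String[] args) {
        System.out.println(buildFrozen().summary());
    }
}
```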
`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, WeightInit scheme)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `scheme`: weight init scheme to use for params in layerNum and layerNum+1

`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, Distribution dist)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `dist`: distribution to use, in conjunction with weight init `WeightInit.DISTRIBUTION`, for params in layerNum and layerNum+1
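A sketch of changing a layer's nOut, assuming DL4J on the classpath; the stand-in network, layer index, and sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class NOutReplaceExample {
    public static MultiLayerNetwork buildModified() {
        // Stand-in for a real pretrained model.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(8)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork pretrained = new MultiLayerNetwork(conf);
        pretrained.init();

        // Change layer 0's nOut from 8 to 16. Because layer 1 follows it,
        // layer 1's nIn is adjusted to 16 as well, and the affected params
        // are re-initialized from the given distribution.
        return new TransferLearning.Builder(pretrained)
                .nOutReplace(0, 16, new NormalDistribution(0, 0.5))
                .build();
    }

    public static void main(String[] args) {
        System.out.println(buildModified().summary());
    }
}
```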
`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, WeightInit scheme, WeightInit schemeNext)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `scheme`: weight init scheme to use for params in layerNum
- `schemeNext`: weight init scheme to use for params in layerNum+1

`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `dist`: distribution to use for params in layerNum, in conjunction with weight init `WeightInit.DISTRIBUTION`
- `distNext`: distribution to use for params in layerNum+1
`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `scheme`: weight init scheme to use for params in layerNum
- `distNext`: distribution to use for params in layerNum+1, in conjunction with weight init `WeightInit.DISTRIBUTION`

`public TransferLearning.Builder nOutReplace(int layerNum, int nOut, Distribution dist, WeightInit schemeNext)`

Parameters:
- `layerNum`: the index of the layer to change nOut of
- `nOut`: value of nOut to change to
- `dist`: distribution to use for params in layerNum, in conjunction with weight init `WeightInit.DISTRIBUTION`
- `schemeNext`: weight init scheme to use for params in layerNum+1
`public TransferLearning.Builder removeOutputLayer()`

`public TransferLearning.Builder removeLayersFromOutput(int layerNum)`

Parameters:
- `layerNum`: number of layers to remove

`public TransferLearning.Builder addLayer(Layer layer)`

Parameters:
- `layer`: layer conf to add (similar to `NeuralNetConfiguration.list().layer(...)`)

`public TransferLearning.Builder setInputPreProcessor(int layer, InputPreProcessor processor)`
Parameters:
- `processor`: preprocessor to be used on the data

`public MultiLayerNetwork build()`

Returns a model with the fine-tune configuration and the specified architecture changes.
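Putting the pieces together, a sketch of a typical builder chain: fine-tune settings, frozen feature-extractor layers, and an output layer swapped for one with a different class count. It assumes DL4J on the classpath; the stand-in network, indices, and sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class TransferLearningExample {
    public static MultiLayerNetwork buildTuned() {
        // Stand-in for a real pretrained model (3 output classes).
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(8)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork pretrained = new MultiLayerNetwork(conf);
        pretrained.init();

        FineTuneConfiguration ftc = new FineTuneConfiguration.Builder()
                .updater(new Adam(1e-4))
                .build();

        // Freeze layer 0, then replace the output layer with a 5-class one.
        // After removing a layer, an output layer must be added back in.
        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(ftc)
                .setFeatureExtractor(0)
                .removeOutputLayer()
                .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(5).activation(Activation.SOFTMAX).build())
                .build();
    }

    public static void main(String[] args) {
        System.out.println(buildTuned().summary());
    }
}
```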