Class | Description |
---|---|
ActivationLayer | |
ActivationLayer.Builder | |
AutoEncoder | Autoencoder. |
AutoEncoder.Builder | |
BaseOutputLayer | |
BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>> | |
BasePretrainNetwork | |
BasePretrainNetwork.Builder<T extends BasePretrainNetwork.Builder<T>> | |
BaseRecurrentLayer | |
BaseRecurrentLayer.Builder<T extends BaseRecurrentLayer.Builder<T>> | |
BatchNormalization | Batch normalization configuration. |
BatchNormalization.Builder | |
CenterLossOutputLayer | Center loss is similar to triplet loss except that it enforces intraclass consistency and doesn't require feed forward of multiple examples. |
CenterLossOutputLayer.Builder | |
Convolution1DLayer | 1D (temporal) convolutional layer. |
Convolution1DLayer.Builder | |
ConvolutionLayer | |
ConvolutionLayer.BaseConvBuilder<T extends ConvolutionLayer.BaseConvBuilder<T>> | |
ConvolutionLayer.Builder | |
DenseLayer | Dense layer: fully connected feed-forward layer trainable by backprop (see the configuration sketch after these tables). |
DenseLayer.Builder | |
DropoutLayer | |
DropoutLayer.Builder | |
EmbeddingLayer | Embedding layer: feed-forward layer that expects single integers per example (class numbers, in range 0 to numClass-1) as input. |
EmbeddingLayer.Builder | |
FeedForwardLayer | Base class for feed-forward layer configurations (layers with a fixed number of inputs, nIn, and outputs, nOut). |
FeedForwardLayer.Builder<T extends FeedForwardLayer.Builder<T>> | |
GlobalPoolingLayer | Global pooling layer - used to do pooling over time for RNNs, and 2D pooling for CNNs. Supports the following PoolingTypes: SUM, AVG, MAX, PNORM. Can also handle mask arrays when dealing with variable-length inputs. |
GlobalPoolingLayer.Builder | |
GravesBidirectionalLSTM | Bidirectional LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
GravesBidirectionalLSTM.Builder | |
GravesLSTM | LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
GravesLSTM.Builder | |
InputTypeUtil | Utilities for calculating input types. |
Layer | A neural network layer. |
Layer.Builder<T extends Layer.Builder<T>> | |
LayerBuilderTest | |
LayerConfigTest | |
LayerConfigValidationTest | |
LayerValidation | Validation methods for layer configurations. |
LocalResponseNormalization | Local response normalization (LRN) layer configuration for convolutional networks. |
LocalResponseNormalization.Builder | |
LossLayer | LossLayer is a flexible output "layer" that performs a loss function on an input without MLP logic. |
LossLayer.Builder | |
OutputLayer | Output layer for classification or regression; applies a loss function to the network output. |
OutputLayer.Builder | |
RBM | Restricted Boltzmann Machine. |
RBM.Builder | |
RnnOutputLayer | |
RnnOutputLayer.Builder | |
Subsampling1DLayer | 1D (temporal) subsampling layer. |
Subsampling1DLayer.Builder | |
SubsamplingLayer | Subsampling layer, also referred to as pooling in convolutional neural nets. Supports the following pooling types: MAX, AVG, NONE (see the CNN sketch after these tables). |
SubsamplingLayer.BaseSubsamplingBuilder<T extends SubsamplingLayer.BaseSubsamplingBuilder<T>> | |
SubsamplingLayer.Builder | |
ZeroPaddingLayer | Zero padding layer for convolutional neural networks. |
ZeroPaddingLayer.Builder | |
Enum | Description |
---|---|
ConvolutionLayer.AlgoMode | |
PoolingType | Pooling types (SUM, AVG, MAX, PNORM), as used by GlobalPoolingLayer. |
RBM.HiddenUnit | |
RBM.VisibleUnit | |
SubsamplingLayer.PoolingType | |
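
As a rough illustration of how the Builder classes above are used, here is a minimal sketch of a two-layer feed-forward configuration built with DenseLayer.Builder and OutputLayer.Builder. It assumes a DL4J release contemporary with this class list (one that still ships RBM and GravesLSTM); the seed, layer sizes, activations, and loss function are illustrative choices, not values prescribed by the API.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class DenseNetConfigSketch {
    public static void main(String[] args) {
        // Hypothetical layer sizes (784 -> 256 -> 10), e.g. for MNIST-style input.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)                                   // fixed RNG seed for reproducibility
                .list()
                // Hidden layer: DenseLayer.Builder sets nIn/nOut and the activation function
                .layer(0, new DenseLayer.Builder()
                        .nIn(784)
                        .nOut(256)
                        .activation(Activation.RELU)
                        .build())
                // Output layer: the loss function is passed to OutputLayer.Builder
                .layer(1, new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(256)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .pretrain(false)
                .backprop(true)
                .build();

        // Print the resulting configuration as JSON
        System.out.println(conf.toJson());
    }
}
```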
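
Similarly, a hedged sketch of a small CNN configuration combining ConvolutionLayer, SubsamplingLayer (with SubsamplingLayer.PoolingType.MAX), and OutputLayer. The input shape (28x28x1), channel counts, and kernel/stride sizes are assumptions for illustration; setInputType is used so DL4J can infer the remaining nIn values and insert the CNN-to-feed-forward preprocessor.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class CnnPoolingConfigSketch {
    public static void main(String[] args) {
        // Hypothetical 28x28 single-channel input with 10 output classes.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(42)
                .list()
                // 5x5 convolution producing 20 feature maps
                .layer(0, new ConvolutionLayer.Builder(5, 5)
                        .nIn(1)
                        .nOut(20)
                        .stride(1, 1)
                        .activation(Activation.RELU)
                        .build())
                // 2x2 max pooling; the pooling type comes from SubsamplingLayer.PoolingType
                .layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                        .kernelSize(2, 2)
                        .stride(2, 2)
                        .build())
                // Output layer; nIn is inferred from the input type set below
                .layer(2, new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .setInputType(InputType.convolutional(28, 28, 1))
                .pretrain(false)
                .backprop(true)
                .build();

        // Print the resulting configuration as JSON
        System.out.println(conf.toJson());
    }
}
```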