| Modifier and Type | Method and Description |
|---|---|
| SDVariable[] | SDRNN.lstmLayer(SDVariable x, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997.<br>Supported data formats - unidirectional: TNS [timeLength, numExamples, inOutSize], NST [numExamples, inOutSize, timeLength], NTS [numExamples, timeLength, inOutSize]; bidirectional: T2NS [timeLength, 2, numExamples, inOutSize] (for ONNX).<br>Supported direction modes: FWD (forward), BWD (backward), BIDIR_SUM (bidirectional sum), BIDIR_CONCAT (bidirectional concat), BIDIR_EXTRA_DIM (bidirectional extra output dim; used together with data format T2NS).<br>Different gate configurations are supported: the gate/cell/out alpha/beta values and activations can be chosen from the activations enum ("RELU","SIGMOID","AFFINE","LEAKY_RELU","THRESHHOLD_RELU","SCALED_TAHN","HARD_SIGMOID","ELU","SOFTSIGN","SOFTPLUS").<br>This layer also supports MKLDNN (DNNL) and cuDNN acceleration. |
| SDVariable[] | SDRNN.lstmLayer(SDVariable x, SDVariable cLast, SDVariable yLast, SDVariable maxTSLength, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997. See the first lstmLayer overload above for the supported data formats, direction modes, activations, and acceleration notes. |
| SDVariable[] | SDRNN.lstmLayer(String[] names, SDVariable x, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997. See the first lstmLayer overload above for the supported data formats, direction modes, activations, and acceleration notes. |
| SDVariable[] | SDRNN.lstmLayer(String[] names, SDVariable x, SDVariable cLast, SDVariable yLast, SDVariable maxTSLength, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997. See the first lstmLayer overload above for the supported data formats, direction modes, activations, and acceleration notes. |
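The data formats listed above determine how the input array x is laid out. As a stand-alone illustration (a hypothetical helper, not part of the ND4J API; only the format names and axis orderings come from the documentation above), the shape each format implies for given timeLength, numExamples, and inOutSize can be sketched as:

```java
import java.util.Arrays;

// Hypothetical helper illustrating the lstmLayer data formats described above.
// Not part of the ND4J API; shapes follow the table's documentation.
public class LstmShapes {
    /** Expected shape of the input x for a given lstmLayer data format. */
    public static long[] inputShape(String format, long timeLength, long numExamples, long inOutSize) {
        switch (format) {
            case "TNS":  return new long[]{timeLength, numExamples, inOutSize};
            case "NST":  return new long[]{numExamples, inOutSize, timeLength};
            case "NTS":  return new long[]{numExamples, timeLength, inOutSize};
            case "T2NS": return new long[]{timeLength, 2, numExamples, inOutSize}; // bidirectional, ONNX-style
            default: throw new IllegalArgumentException("Unknown data format: " + format);
        }
    }

    public static void main(String[] args) {
        // e.g. 5 timesteps, batch of 3, feature size 8
        System.out.println(Arrays.toString(inputShape("TNS", 5, 3, 8)));  // [5, 3, 8]
        System.out.println(Arrays.toString(inputShape("NST", 5, 3, 8)));  // [3, 8, 5]
    }
}
```

Note that T2NS is only meaningful for the bidirectional direction modes, which is why the table pairs it with BIDIR_EXTRA_DIM.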
| Constructor and Description |
|---|
| LSTMLayer(INDArray x, INDArray cLast, INDArray yLast, INDArray maxTSLength, LSTMLayerWeights lstmWeights, LSTMLayerConfig LSTMLayerConfig) |
| LSTMLayer(@NonNull SameDiff sameDiff, SDVariable x, SDVariable cLast, SDVariable yLast, SDVariable maxTSLength, LSTMLayerWeights weights, LSTMLayerConfig configuration) |
| LSTMLayerBp(@NonNull SameDiff sameDiff, @NonNull SDVariable x, SDVariable cLast, SDVariable yLast, SDVariable maxTSLength, @NonNull LSTMLayerWeights weights, @NonNull LSTMLayerConfig configuration, SDVariable dLdh, SDVariable dLdhL, SDVariable dLdcL) |
| Constructor and Description |
|---|
| LSTMLayerOutputs(SDVariable[] outputs, LSTMLayerConfig lstmLayerConfig) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray[] | NDRNN.lstmLayer(INDArray x, INDArray cLast, INDArray yLast, INDArray maxTSLength, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997. See the SDRNN.lstmLayer table above for the supported data formats, direction modes, activations, and acceleration notes. |
| INDArray[] | NDRNN.lstmLayer(INDArray x, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)<br>Long Short-Term Memory layer - Hochreiter 1997. See the SDRNN.lstmLayer table above for the supported data formats, direction modes, activations, and acceleration notes. |
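The direction modes affect the width of the per-timestep output. As a rough sketch based only on the mode names documented above (BIDIR_SUM sums the forward and backward outputs element-wise, BIDIR_CONCAT concatenates them, BIDIR_EXTRA_DIM keeps directions in a separate dimension of size 2 together with T2NS); this is a hypothetical illustration, not the library's actual shape-inference code:

```java
// Hypothetical sketch of how lstmLayer direction modes affect the
// per-timestep output width, given a hidden size nOut.
public class LstmDirectionModes {
    public static long outputWidth(String mode, long nOut) {
        switch (mode) {
            case "FWD":             // forward only
            case "BWD":             // backward only
            case "BIDIR_SUM":       // forward + backward summed element-wise, width unchanged
            case "BIDIR_EXTRA_DIM": // directions kept in an extra dimension of size 2 (T2NS)
                return nOut;
            case "BIDIR_CONCAT":    // forward and backward concatenated along the feature axis
                return 2 * nOut;
            default:
                throw new IllegalArgumentException("Unknown direction mode: " + mode);
        }
    }

    public static void main(String[] args) {
        System.out.println(outputWidth("BIDIR_SUM", 16));    // 16
        System.out.println(outputWidth("BIDIR_CONCAT", 16)); // 32
    }
}
```

In practice this is why BIDIR_CONCAT doubles the feature dimension of the layer output while BIDIR_SUM leaves it unchanged.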
Copyright © 2021. All rights reserved.