| Package | Description |
|---|---|
| org.nd4j.autodiff.samediff.internal | |
| org.nd4j.linalg.learning | |
| org.nd4j.linalg.learning.config | |
| Modifier and Type | Field and Description |
|---|---|
| protected Map<String,GradientUpdater> | TrainingSession.updaters |
| Modifier and Type | Method and Description |
|---|---|
| Loss | TrainingSession.trainingIteration(TrainingConfig config, Map<String,INDArray> placeholders, Set<String> paramsToTrain, Map<String,GradientUpdater> updaters, MultiDataSet batch, List<String> lossVariables, List<Listener> listeners, At at)<br>Perform one iteration of training, i.e. do the forward and backward passes, and update the parameters. |
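The `trainingIteration` description above names the three phases of one training step: a forward pass, a backward pass, and a per-parameter update. As a conceptual sketch only (plain Java on a toy one-parameter model, not the SameDiff internals; the class and method names below are invented for illustration):

```java
// Conceptual sketch of one training iteration: forward pass, backward
// pass (gradient computation), then a parameter update. This is a plain
// Java stand-in for what TrainingSession.trainingIteration orchestrates;
// all names here are illustrative, not the real SameDiff internals.
public class TrainingIterationSketch {
    // Toy model: loss(w) = (w - target)^2, so dLoss/dw = 2 * (w - target)
    static double forward(double w, double target) {
        double diff = w - target;
        return diff * diff;                 // forward pass: compute the loss
    }

    static double backward(double w, double target) {
        return 2.0 * (w - target);          // backward pass: gradient of the loss
    }

    public static void main(String[] args) {
        double w = 0.0, target = 3.0, lr = 0.1;
        for (int iter = 0; iter < 100; iter++) {
            double loss = forward(w, target);   // 1. forward
            double grad = backward(w, target);  // 2. backward
            w -= lr * grad;                     // 3. updater step (plain SGD here)
        }
        System.out.printf("w after training: %.4f%n", w); // converges toward 3.0
    }
}
```

In the real API, step 3 is delegated to a `GradientUpdater` looked up from the `updaters` map, one per trainable parameter.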
| Modifier and Type | Class and Description |
|---|---|
| class | AdaBeliefUpdater |
| class | AdaDeltaUpdater |
| class | AdaGradUpdater |
| class | AdaMaxUpdater<br>The AdaMax updater, a variant of Adam. |
| class | AdamUpdater<br>The Adam updater. |
| class | AMSGradUpdater |
| class | NadamUpdater<br>The Nadam updater. |
| class | NesterovsUpdater |
| class | NoOpUpdater |
| class | RmsPropUpdater |
| class | SgdUpdater<br>The SGD updater, which applies only a learning rate. |
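Each updater class above applies one update rule to a gradient in place. As a hedged sketch of what, for example, AdamUpdater computes, here is the standard published Adam rule in plain Java rather than the ND4J `INDArray` implementation (the hyperparameter defaults are the commonly used values, an assumption for this sketch):

```java
// Plain-Java sketch of the standard Adam update rule that AdamUpdater
// implements over INDArrays. The defaults below are the usual published
// values; this is illustrative, not the ND4J code itself.
public class AdamSketch {
    static final double LR = 0.001, BETA1 = 0.9, BETA2 = 0.999, EPS = 1e-8;

    final double[] m, v;   // first- and second-moment state (the "updater state")

    AdamSketch(int numParams) {
        m = new double[numParams];
        v = new double[numParams];
    }

    // Overwrites the gradient with the update step, in the spirit of
    // GradientUpdater's in-place applyUpdater(gradient, iteration, epoch).
    void applyUpdater(double[] gradient, int iteration) {
        int t = iteration + 1;             // 1-based time step for bias correction
        for (int i = 0; i < gradient.length; i++) {
            m[i] = BETA1 * m[i] + (1 - BETA1) * gradient[i];
            v[i] = BETA2 * v[i] + (1 - BETA2) * gradient[i] * gradient[i];
            double mHat = m[i] / (1 - Math.pow(BETA1, t));   // bias-corrected moments
            double vHat = v[i] / (1 - Math.pow(BETA2, t));
            gradient[i] = LR * mHat / (Math.sqrt(vHat) + EPS);
        }
    }
}
```

The first update step is instructive: the bias correction makes `mHat` equal to the raw gradient and `vHat` its square, so the step size is approximately `LR` regardless of the gradient's magnitude.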
| Modifier and Type | Method and Description |
|---|---|
| GradientUpdater | AdaBelief.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | AdaDelta.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | AdaGrad.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | Adam.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | AdaMax.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | AMSGrad.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | IUpdater.instantiate(INDArray viewArray, boolean initializeViewArray)<br>Create a new gradient updater. |
| GradientUpdater | Nadam.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | Nesterovs.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | NoOp.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | RmsProp.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | Sgd.instantiate(INDArray viewArray, boolean initializeViewArray) |
| GradientUpdater | AdaBelief.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | AdaDelta.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | AdaGrad.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | Adam.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | AdaMax.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | AMSGrad.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | IUpdater.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | Nadam.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | Nesterovs.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | NoOp.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | RmsProp.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
| GradientUpdater | Sgd.instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays) |
Copyright © 2021. All rights reserved.