
DeepSpeedPrecisionPlugin

class pytorch_lightning.plugins.precision.DeepSpeedPrecisionPlugin(precision)[source]

Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

Precision plugin for DeepSpeed integration.
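
A minimal usage sketch, assuming PyTorch Lightning ~1.3 with DeepSpeed installed (the Trainer flags shown are illustrative; in practice Lightning constructs this plugin automatically when a DeepSpeed training type plugin is selected):

    import pytorch_lightning as pl
    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    # Explicit construction; `precision` mirrors the Trainer's precision flag.
    precision_plugin = DeepSpeedPrecisionPlugin(precision=16)

    # More commonly, the plugin is created internally when DeepSpeed is used:
    trainer = pl.Trainer(gpus=1, plugins="deepspeed", precision=16)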

backward(model, closure_loss, optimizer, opt_idx, should_accumulate, *args, **kwargs)[source]

Performs the actual backpropagation.

Parameters
  • model (LightningModule) – the model to be optimized

  • closure_loss (Tensor) – the loss value obtained from the closure

  • optimizer (Optimizer) – the optimizer to perform the step later on

  • opt_idx (int) – the optimizer’s index

  • should_accumulate (bool) – whether to accumulate gradients or not

Return type

Tensor
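
As a sketch of how this hook composes, a subclass can inspect the loss before delegating backpropagation to the parent implementation (the subclass name and logging are illustrative, not part of the library):

    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    class VerboseDeepSpeedPrecisionPlugin(DeepSpeedPrecisionPlugin):
        def backward(self, model, closure_loss, optimizer, opt_idx,
                     should_accumulate, *args, **kwargs):
            # Log the closure loss before DeepSpeed runs the backward pass.
            print(f"opt_idx={opt_idx} loss={closure_loss.item():.4f} "
                  f"accumulate={should_accumulate}")
            return super().backward(model, closure_loss, optimizer, opt_idx,
                                    should_accumulate, *args, **kwargs)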

clip_gradients(optimizer, clip_val, gradient_clip_algorithm=GradClipAlgorithmType.NORM, model=None)[source]

DeepSpeed handles gradient clipping internally via the training type plugin, so this method is a no-op.

Return type

None
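
Because clipping happens inside DeepSpeed, the clip value is supplied through the Trainer (or a raw DeepSpeed config) rather than through this plugin. A sketch, assuming the PyTorch Lightning ~1.3 DeepSpeed integration:

    import pytorch_lightning as pl

    # gradient_clip_val is forwarded to DeepSpeed's internal clipping,
    # corresponding to "gradient_clipping" in a raw DeepSpeed JSON config.
    trainer = pl.Trainer(
        gpus=1,
        plugins="deepspeed",
        precision=16,
        gradient_clip_val=1.0,
    )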

pre_optimizer_step(pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs)[source]

Hook that runs before each optimizer step.

Return type

bool
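
The returned bool signals whether the caller should still run the optimizer step itself; with DeepSpeed, the engine typically performs the step internally, so the plugin can return False to suppress a duplicate step. A hedged sketch of wrapping this hook (the subclass and timing logic are illustrative, not part of the library):

    import time

    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    class TimedDeepSpeedPrecisionPlugin(DeepSpeedPrecisionPlugin):
        def pre_optimizer_step(self, pl_module, optimizer, optimizer_idx,
                               lambda_closure, **kwargs):
            # Time the pre-step work; the bool is passed through unchanged so
            # the caller's decision to run optimizer.step() is unaffected.
            start = time.perf_counter()
            should_step = super().pre_optimizer_step(
                pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs)
            print(f"pre_optimizer_step took {time.perf_counter() - start:.3f}s")
            return should_step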
