DeepSpeedPrecisionPlugin

class pytorch_lightning.plugins.precision.DeepSpeedPrecisionPlugin(precision)[source]

Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

Precision plugin for DeepSpeed integration.
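
A minimal usage sketch, assuming a Lightning 1.x-era Trainer API (the GPU count, ZeRO stage, and 16-bit precision below are illustrative choices, not requirements):

    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins import DeepSpeedPlugin

    # Requesting DeepSpeed together with 16-bit precision makes Lightning
    # construct a DeepSpeedPrecisionPlugin(precision=16) internally.
    trainer = Trainer(gpus=2, plugins=DeepSpeedPlugin(stage=2), precision=16)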

backward(model, closure_loss, optimizer, opt_idx, should_accumulate, *args, **kwargs)[source]

Performs the actual backpropagation.

Parameters
  • model (LightningModule) – the model to be optimized

  • closure_loss (Tensor) – the loss value obtained from the closure

  • optimizer (Optimizer) – the optimizer to perform the step later on

  • opt_idx (int) – the optimizer’s index

  • should_accumulate (bool) – whether to accumulate gradients or not

Return type

Tensor
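
As a sketch of how this signature composes, a hypothetical subclass can inspect the loss before delegating to the parent implementation (the subclass name and the print call are illustrative, not part of the API):

    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    class LoggingDeepSpeedPrecision(DeepSpeedPrecisionPlugin):
        """Hypothetical subclass that inspects the loss before delegating."""

        def backward(self, model, closure_loss, optimizer, opt_idx, should_accumulate, *args, **kwargs):
            # closure_loss is the Tensor produced by the training-step closure.
            print(f"loss before backward: {closure_loss.item():.4f}")
            return super().backward(
                model, closure_loss, optimizer, opt_idx, should_accumulate, *args, **kwargs
            )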

clip_gradients(optimizer, clip_val, gradient_clip_algorithm=GradClipAlgorithmType.NORM, model=None)[source]

DeepSpeed handles gradient clipping internally via the training type plugin, so this method is a no-op.

Return type

None
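
Since clipping is not done here, it is configured on the DeepSpeed side instead. A sketch using DeepSpeed's standard "gradient_clipping" config key (the clip value and the surrounding Trainer arguments are placeholders):

    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins import DeepSpeedPlugin

    # "gradient_clipping" is DeepSpeed's own config key for norm-based clipping.
    deepspeed_config = {"gradient_clipping": 1.0}
    trainer = Trainer(gpus=2, plugins=DeepSpeedPlugin(config=deepspeed_config), precision=16)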

pre_optimizer_step(pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs)[source]

Hook run before each optimizer step. The returned bool indicates whether Lightning should still perform the optimizer step itself.

Return type

bool
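
A hedged sketch of wrapping this hook in a hypothetical subclass; the logging side effect is illustrative, and the bool returned by the parent implementation is passed through unchanged:

    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    class GuardedDeepSpeedPrecision(DeepSpeedPrecisionPlugin):
        """Hypothetical subclass that logs before each optimizer step."""

        def pre_optimizer_step(self, pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs):
            pl_module.print(f"about to step optimizer {optimizer_idx}")  # illustrative
            # The returned bool tells Lightning whether to run the optimizer
            # step itself after this hook.
            return super().pre_optimizer_step(pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs)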