
DeepSpeedPrecisionPlugin

class pytorch_lightning.plugins.precision.DeepSpeedPrecisionPlugin(precision)[source]

Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

Precision plugin for DeepSpeed integration.
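
The plugin is normally created for you: selecting DeepSpeed together with 16-bit precision on the Trainer makes Lightning construct a DeepSpeedPrecisionPlugin(16) internally. A minimal sketch of that selection follows; the exact Trainer argument used to pick DeepSpeed and the stage alias are assumptions that vary across releases.

    import pytorch_lightning as pl

    # Sketch only: "deepspeed_stage_2" and the plugins= argument are
    # release-dependent assumptions; the point is that precision=16 is what
    # makes the Trainer build a DeepSpeedPrecisionPlugin(16) under the hood.
    trainer = pl.Trainer(
        gpus=4,
        plugins="deepspeed_stage_2",
        precision=16,
    )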

backward(model, closure_loss, *args, **kwargs)[source]

Performs the actual backpropagation.

Parameters
  • model (LightningModule) – the model to be optimized

  • closure_loss (Tensor) – the loss value obtained from the closure

  • optimizer – current optimizer being used; None if using manual optimization

Return type

None
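
A minimal sketch of what this override amounts to, assuming the DeepSpeed engine wraps the LightningModule and is reachable through the trainer (attribute paths are illustrative, not guaranteed by this API):

    from pytorch_lightning.plugins.precision.precision_plugin import PrecisionPlugin

    class SketchDeepSpeedPrecision(PrecisionPlugin):
        # Illustrative only: backpropagation is handed to the DeepSpeed engine,
        # which owns loss scaling and gradient accumulation, rather than
        # calling closure_loss.backward() directly.
        def backward(self, model, closure_loss, *args, **kwargs):
            deepspeed_engine = model.trainer.model  # assumption: engine lives here
            deepspeed_engine.backward(closure_loss, *args, **kwargs)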

clip_gradients(optimizer, clip_val, gradient_clip_algorithm=<GradClipAlgorithmType.NORM: 'norm'>, model=None)[source]

DeepSpeed handles clipping gradients internally via the training type plugin.

Return type

None
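
Since the method is a no-op here, a clipping value is typically carried by the DeepSpeed configuration instead (Lightning's DeepSpeed integration fills it in from the Trainer's gradient_clip_val). A small sketch of such a config with placeholder values:

    # "gradient_clipping" is a standard DeepSpeed config key; the value and
    # the ZeRO stage below are placeholders, not recommendations.
    deepspeed_config = {
        "gradient_clipping": 1.0,
        "zero_optimization": {"stage": 2},
    }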

pre_optimizer_step(model, optimizer, optimizer_idx, lambda_closure, **kwargs)[source]

Hook to do something before each optimizer step.

Return type

bool
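
A sketch of a subclass that wraps this hook, assuming (as in the other precision plugins of this API) that the boolean return tells Lightning whether it should still perform the optimizer step itself:

    from pytorch_lightning.plugins.precision import DeepSpeedPrecisionPlugin

    class LoggingDeepSpeedPrecision(DeepSpeedPrecisionPlugin):
        # Illustrative subclass: log, then delegate to the parent; the parent's
        # bool return (assumption) signals whether Lightning should still call
        # optimizer.step() itself.
        def pre_optimizer_step(self, model, optimizer, optimizer_idx, lambda_closure, **kwargs):
            print(f"pre-step for optimizer {optimizer_idx}")
            return super().pre_optimizer_step(
                model, optimizer, optimizer_idx, lambda_closure, **kwargs
            )

    plugin = LoggingDeepSpeedPrecision(16)  # precision argument as documented above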