ApexMixedPrecisionPlugin

class pytorch_lightning.plugins.precision.ApexMixedPrecisionPlugin(amp_level='O2')[source]

Bases: pytorch_lightning.plugins.precision.mixed.MixedPrecisionPlugin

Mixed Precision Plugin based on Nvidia/Apex (https://github.com/NVIDIA/apex)
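
To make the Trainer construct this plugin, select the Apex backend when enabling 16-bit training. A minimal sketch, assuming a PyTorch Lightning 1.x release where Trainer accepts the amp_backend and amp_level arguments, and that NVIDIA Apex and a CUDA device are available:

    import pytorch_lightning as pl

    # Selecting the Apex backend makes the Trainer build an
    # ApexMixedPrecisionPlugin internally with the given amp_level.
    trainer = pl.Trainer(
        gpus=1,              # Apex mixed precision runs on CUDA devices
        precision=16,        # request 16-bit training
        amp_backend="apex",  # use NVIDIA/Apex instead of native AMP
        amp_level="O2",      # forwarded to this plugin's constructor
    )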

backward(model, closure_loss, optimizer, opt_idx, should_accumulate, *args, **kwargs)[source]

Performs the actual backpropagation.

Parameters
  • model (LightningModule) – the model to be optimized

  • closure_loss (Tensor) – the loss value obtained from the closure

  • optimizer (Optimizer) – the optimizer used to perform the step later on

  • opt_idx (int) – the optimizer index

  • should_accumulate (bool) – whether to accumulate gradients or not

Return type

Tensor
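
Under the hood, Apex scales the loss before backpropagation so that fp16 gradients do not underflow. A sketch of the underlying Apex pattern that this hook wraps (the function name apex_backward is illustrative, not part of the plugin):

    from apex import amp

    def apex_backward(closure_loss, optimizer):
        # Scale the loss, backpropagate through the scaled value, and
        # return the unscaled loss for logging; amp.scale_loss handles
        # the loss-scale bookkeeping for an amp-initialized optimizer.
        with amp.scale_loss(closure_loss, optimizer) as scaled_loss:
            scaled_loss.backward()
        return closure_loss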

dispatch(trainer)[source]

Hook to do something when the trainer's run_stage starts.

Return type

None

master_params(optimizer)[source]

The master params of the model. Returns the plain model params here; this may differ in other precision plugins.

Return type

Iterator[Parameter]
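
Apex exposes these through amp.master_params(optimizer), which yields the fp32 master copies when the chosen opt_level maintains them and otherwise the plain model parameters. A small illustrative helper (count_master_params is a hypothetical name, not part of the plugin):

    from apex import amp

    def count_master_params(optimizer):
        # Iterate over the parameters Apex tracks for this
        # amp-initialized optimizer and total up their elements.
        return sum(p.numel() for p in amp.master_params(optimizer))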

pre_optimizer_step(pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs)[source]

Always called before the optimizer step.

Return type

bool
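
The boolean return value signals whether the caller should still run the optimizer step itself. A hedged sketch of a subclass that observes each step before deferring to the parent (LoggingApexPlugin is an illustrative name; the signature mirrors the one documented above):

    from pytorch_lightning.plugins.precision import ApexMixedPrecisionPlugin

    class LoggingApexPlugin(ApexMixedPrecisionPlugin):
        def pre_optimizer_step(self, pl_module, optimizer, optimizer_idx,
                               lambda_closure, **kwargs):
            # Log, then let the parent decide whether the optimizer step
            # should still be run by the caller.
            print(f"optimizer {optimizer_idx}: about to step")
            return super().pre_optimizer_step(
                pl_module, optimizer, optimizer_idx, lambda_closure, **kwargs
            )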

static reinit_scheduler_properties(optimizers, schedulers)[source]

Reinitializes schedulers with correct properties.

Return type

None
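
This is needed because amp.initialize can wrap or replace the optimizer objects, leaving any schedulers built earlier bound to stale optimizer state. A minimal sketch, assuming Apex and a CUDA device are available and that this Lightning version expects Trainer-style scheduler dicts for the schedulers argument:

    import torch
    from apex import amp
    from pytorch_lightning.plugins.precision import ApexMixedPrecisionPlugin

    model = torch.nn.Linear(4, 4).cuda()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)

    # amp.initialize may patch or replace the optimizer, so rebind the
    # scheduler's optimizer-related properties afterwards.
    model, optimizer = amp.initialize(model, optimizer, opt_level="O2")
    ApexMixedPrecisionPlugin.reinit_scheduler_properties(
        [optimizer], [{"scheduler": scheduler}]
    )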