SingleHPUStrategy

class pytorch_lightning.strategies.SingleHPUStrategy(device='hpu', accelerator=None, checkpoint_io=None, precision_plugin=None)[source]

Bases: pytorch_lightning.strategies.single_device.SingleDeviceStrategy

Strategy for training on a single HPU device.
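
A minimal usage sketch, assuming the Habana SynapseAI / habana_frameworks stack is installed; MyLightningModule and my_datamodule are placeholder names for code defined elsewhere. Passing the strategy explicitly is usually optional, since the Trainer can select it from accelerator="hpu" with devices=1:

    import pytorch_lightning as pl
    from pytorch_lightning.strategies import SingleHPUStrategy

    # Train a LightningModule on one HPU device.
    # MyLightningModule and my_datamodule are placeholders defined elsewhere.
    trainer = pl.Trainer(
        accelerator="hpu",
        devices=1,
        strategy=SingleHPUStrategy(),
        max_epochs=3,
    )
    trainer.fit(MyLightningModule(), datamodule=my_datamodule)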

model_to_device()[source]

Moves the model to the correct device.

Return type

None
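
A conceptual sketch (not the library source) of what this hook amounts to for a single-device strategy; root_device is assumed to be the torch.device the strategy was constructed with, i.e. the HPU here:

    def model_to_device(self) -> None:
        # Move the wrapped LightningModule onto the strategy's device,
        # e.g. torch.device("hpu") for this strategy.
        self.model.to(self.root_device)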

optimizer_step(optimizer, opt_idx, closure, model=None, **kwargs)[source]

Performs the actual optimizer step.

Parameters
  • optimizer (Optimizer) – the optimizer performing the step

  • opt_idx (int) – index of the current optimizer

  • closure (Callable[[], Any]) – closure calculating the loss value

  • model (Union[LightningModule, Module, None]) – reference to the model, which may optionally define optimizer-step-related hooks

  • **kwargs – Any extra arguments to optimizer.step

Return type

Any
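
The Trainer's loops invoke this method for you; the sketch below only illustrates the documented signature. strategy, optimizer, and training_step_closure are assumed to come from an active fit run (the closure is built by the loop and computes the loss for the step):

    # Illustrative call, normally issued by the training loop, not by user code.
    loss = strategy.optimizer_step(
        optimizer,
        opt_idx=0,                      # index of the current optimizer
        closure=training_step_closure,  # computes the loss for this step
    )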

setup(trainer)[source]

Sets up plugins for the Trainer fit and creates optimizers.

Parameters

trainer (Trainer) – the trainer instance

Return type

None
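
A conceptual sketch (not the library source) of how setup() ties together the other hooks documented on this page, assuming the model is first placed on the HPU and the optimizers are then created from the trainer:

    def setup(self, trainer: "pl.Trainer") -> None:
        self.model_to_device()          # place the LightningModule on the HPU
        self.setup_optimizers(trainer)  # create optimizers and LR schedulers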

setup_optimizers(trainer)[source]

Creates optimizers and schedulers.

Parameters

trainer (Trainer) – the Trainer to which these optimizers should be connected

Return type

None
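
The optimizers and schedulers created here come from the LightningModule's configure_optimizers() hook. A minimal example of such a hook (the optimizer, learning rate, and scheduler choices are illustrative):

    import torch
    import pytorch_lightning as pl

    class MyLightningModule(pl.LightningModule):
        def configure_optimizers(self):
            # setup_optimizers(trainer) collects what is returned here.
            optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
            scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
            return [optimizer], [scheduler]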