TorchCheckpointIO

class pytorch_lightning.plugins.io.TorchCheckpointIO[source]

Bases: pytorch_lightning.plugins.io.checkpoint_plugin.CheckpointIO

CheckpointIO that uses torch.save() and torch.load() to save and load checkpoints, respectively; suitable for most use cases.

load_checkpoint(path, map_location=<function TorchCheckpointIO.<lambda>>)[source]

Loads checkpoint using torch.load(), with additional handling for fsspec remote loading of files.

Parameters
  • path (Union[str, Path]) – Path to checkpoint

  • map_location (Optional[Callable]) – a function, torch.device, string or a dict specifying how to remap storage locations.

Returns: The loaded checkpoint.

Raises

FileNotFoundError – If path is not found by the fsspec filesystem

Return type

Dict[str, Any]

remove_checkpoint(path)[source]

Remove checkpoint file from the filesystem.

Parameters

path (Union[str, Path]) – Path to checkpoint

Return type

None

save_checkpoint(checkpoint, path, storage_options=None)[source]

Save model/training states as a checkpoint file through state-dump and file-write.

Parameters
  • checkpoint (Dict[str, Any]) – dict containing model and trainer state

  • path (Union[str, Path]) – write-target path

  • storage_options (Optional[Any]) – Optional parameters when saving the model/training states.

Return type

None