XLAStatsMonitor

class pytorch_lightning.callbacks.XLAStatsMonitor(verbose=True)[source]

Bases: pytorch_lightning.callbacks.base.Callback

Deprecated since version v1.5: The XLAStatsMonitor callback was deprecated in v1.5 and will be removed in v1.7. Please use the DeviceStatsMonitor callback instead.
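
The replacement can be dropped in wherever XLAStatsMonitor was used; a minimal migration sketch (DeviceStatsMonitor ships with v1.5 and later):

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import DeviceStatsMonitor
>>> trainer = Trainer(callbacks=[DeviceStatsMonitor()])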

Automatically monitors and logs XLA stats during the training stage. XLAStatsMonitor is a callback, so to use it you need to assign a logger to the Trainer.

Parameters

verbose (bool) – Set to True to print the average peak and free memory, and the epoch time, at the end of every epoch.

Raises

MisconfigurationException – If not running on TPUs, or if the Trainer has no logger.

Example:

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import XLAStatsMonitor
>>> xla_stats = XLAStatsMonitor() 
>>> trainer = Trainer(callbacks=[xla_stats]) 
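
Because the callback raises MisconfigurationException without a logger or a TPU backend (see Raises above), a complete setup passes both to the Trainer. A minimal sketch, assuming a TPU environment; the CSVLogger destination and the tpu_cores count are illustrative:

>>> from pytorch_lightning.loggers import CSVLogger
>>> trainer = Trainer(tpu_cores=8, logger=CSVLogger("logs"), callbacks=[xla_stats])
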
on_train_epoch_end(trainer, pl_module)[source]

Called when the train epoch ends.

To access all batch outputs at the end of the epoch, either:

  1. Implement training_epoch_end in the LightningModule and access the outputs via the module, or

  2. Cache data across the train batch hooks inside the callback implementation and post-process it in this hook (see the sketch below).

Return type

None
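
A minimal sketch of the second approach, caching batch outputs inside a callback (the class name is illustrative, and the on_train_batch_end signature shown is the one used around v1.6):

>>> from pytorch_lightning import Callback
>>> class CacheTrainOutputs(Callback):
...     def on_train_epoch_start(self, trainer, pl_module):
...         # reset the cache at the start of every epoch
...         self.outputs = []
...     def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx):
...         # collect each batch's outputs as training progresses
...         self.outputs.append(outputs)
...     def on_train_epoch_end(self, trainer, pl_module):
...         # post-process the cached outputs here, e.g. aggregate a metric
...         print(f"cached {len(self.outputs)} batch outputs")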

on_train_epoch_start(trainer, pl_module)[source]

Called when the train epoch begins.

Return type

None

on_train_start(trainer, pl_module)[source]

Called when training begins.

Return type

None