XLAStatsMonitor

class pytorch_lightning.callbacks.XLAStatsMonitor(verbose=True)

Bases: pytorch_lightning.callbacks.base.Callback

Deprecated since version 1.5: The XLAStatsMonitor callback was deprecated in v1.5 and will be removed in v1.7. Please use the DeviceStatsMonitor callback instead.

Automatically monitors and logs XLA stats during the training stage. XLAStatsMonitor is a callback, so to use it you must assign a logger to the Trainer.

Parameters

verbose (bool) – Set to True to print the average peak and free memory, and the epoch time, at the end of every epoch.

Raises

MisconfigurationException – If not running on TPUs, or the Trainer has no logger.

Example:

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import XLAStatsMonitor
>>> xla_stats = XLAStatsMonitor() 
>>> trainer = Trainer(callbacks=[xla_stats]) 
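
Because the callback raises MisconfigurationException without a logger or TPUs, a fuller configuration might look like the sketch below. CSVLogger and the tpu_cores Trainer argument are standard Lightning 1.x APIs; the save directory and core count are placeholder values:

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import XLAStatsMonitor
>>> from pytorch_lightning.loggers import CSVLogger
>>> # verbose=True prints the averaged stats at the end of each epoch
>>> xla_stats = XLAStatsMonitor(verbose=True)
>>> trainer = Trainer(
...     callbacks=[xla_stats],
...     logger=CSVLogger(save_dir="logs/"),  # any Lightning logger works
...     tpu_cores=8,  # placeholder: run on an 8-core TPU
... )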
on_train_epoch_end(trainer, pl_module)

Called when the train epoch ends.

To access all batch outputs at the end of the epoch, either:

  1. Implement training_epoch_end in the LightningModule and access outputs via the module, or

  2. Cache data across train batch hooks inside the callback implementation and post-process it in this hook (a sketch of this approach follows below).

Return type

None
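
A minimal sketch of option 2 above, assuming training_step returns a dict with a "loss" key. This callback class, the metric name, and the averaging logic are all illustrative, not part of XLAStatsMonitor; the trailing arguments of on_train_batch_end vary slightly across 1.x releases, so they are absorbed with *args here:

>>> from pytorch_lightning.callbacks import Callback
>>> class CachingCallback(Callback):
...     """Illustrative only: cache per-batch losses, reduce them at epoch end."""
...
...     def on_train_epoch_start(self, trainer, pl_module):
...         self.losses = []  # reset the cache at the start of each epoch
...
...     def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, *args):
...         # `outputs` is what training_step returned for this batch;
...         # assumes it is a dict containing a "loss" tensor
...         self.losses.append(outputs["loss"].item())
...
...     def on_train_epoch_end(self, trainer, pl_module):
...         # post-process the cached data in this hook
...         avg = sum(self.losses) / max(len(self.losses), 1)
...         trainer.logger.log_metrics({"train/avg_loss": avg}, step=trainer.current_epoch)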

on_train_epoch_start(trainer, pl_module)

Called when the train epoch begins.

Return type

None

on_train_start(trainer, pl_module)

Called when training begins.

Return type

None
