
pytorch_lightning.callbacks.lr_logger module

Learning Rate Logger

Log learning rate for lr schedulers during training

class pytorch_lightning.callbacks.lr_logger.LearningRateLogger[source]

Bases: pytorch_lightning.callbacks.base.Callback

Automatically logs learning rate for learning rate schedulers during training.

Example:

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import LearningRateLogger
>>> lr_logger = LearningRateLogger()
>>> trainer = Trainer(callbacks=[lr_logger])

Logging names are automatically determined based on the optimizer class name. In the case of multiple optimizers of the same type, they will be named Adam, Adam-1, etc. If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2, etc. To control naming, pass in a name keyword in the construction of the learning rate schedulers.

Example:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(...)
    lr_scheduler = {'scheduler': torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
                    'name': 'my_logging_name'}
    return [optimizer], [lr_scheduler]
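
If a single optimizer uses several parameter groups, the per-group names described above (Adam/pg1, Adam/pg2, ...) are used. A minimal sketch of such a setup (self.encoder and self.decoder are hypothetical submodules, chosen only for illustration):

def configure_optimizers(self):
    # one Adam optimizer with two parameter groups; the logger would
    # report them under names like Adam/pg1 and Adam/pg2
    optimizer = torch.optim.Adam([
        {'params': self.encoder.parameters(), 'lr': 1e-3},
        {'params': self.decoder.parameters(), 'lr': 1e-4},
    ])
    lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
    return [optimizer], [lr_scheduler]
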
_extract_lr(trainer, interval)[source]

Extracts learning rates for lr schedulers and saves information into dict structure.
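The dict structure is internal to the callback, but as a rough illustration (not the library's implementation) of what extracting a learning rate means in PyTorch, the current value of each parameter group can be read directly from the scheduler's optimizer:

def current_lrs(optimizer):
    # sketch only: each parameter group stores its current learning rate
    # under the 'lr' key
    return [group['lr'] for group in optimizer.param_groups]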

_find_names(lr_schedulers)[source]
on_batch_start(trainer, pl_module)[source]

Called when the training batch begins.

on_epoch_start(trainer, pl_module)[source]

Called when the epoch begins.

on_train_start(trainer, pl_module)[source]

Called before training; determines unique names for all lr schedulers in the case of multiple schedulers of the same type, or of multiple parameter groups.
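
As a hedged sketch of the naming scheme described above (not the actual library code, and unique_names is a hypothetical helper), repeated optimizer class names receive a numeric suffix so that each scheduler gets a distinct logging key:

from collections import Counter

def unique_names(optimizer_classes):
    # e.g. [Adam, Adam, SGD] -> ['Adam', 'Adam-1', 'SGD']
    seen = Counter()
    names = []
    for cls in optimizer_classes:
        name = cls.__name__
        names.append(f'{name}-{seen[name]}' if seen[name] else name)
        seen[name] += 1
    return names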