pytorch_lightning.callbacks.lr_logger module¶
Learning Rate Logger¶
Log the learning rate of learning rate schedulers during training
class pytorch_lightning.callbacks.lr_logger.LearningRateLogger[source]¶
Bases: pytorch_lightning.callbacks.base.Callback
Automatically logs learning rate for learning rate schedulers during training.
Example:
>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import LearningRateLogger
>>> lr_logger = LearningRateLogger()
>>> trainer = Trainer(callbacks=[lr_logger])
Logging names are automatically determined based on the optimizer class name. In case of multiple optimizers of the same type, they will be named Adam, Adam-1, etc. If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2, etc. To control naming, pass a name keyword when constructing the learning rate scheduler:
Example:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(...)
    lr_scheduler = {'scheduler': torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
                    'name': 'my_logging_name'}
    return [optimizer], [lr_scheduler]
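The automatic naming scheme described above can be sketched in plain Python. This is a simplified illustration of the documented behavior (class name, a -1 style suffix for repeats, and /pg1, /pg2 suffixes for parameter groups), not the callback's actual implementation; the helper name and its (class name, group count) input format are invented for the example:

```python
from collections import Counter

def auto_lr_names(optimizers):
    """Sketch of the documented naming rules.

    optimizers: list of (class_name, num_param_groups) pairs -- a
    simplified stand-in for real torch optimizer objects.
    """
    seen = Counter()
    names = []
    for cls_name, n_groups in optimizers:
        # Repeated optimizer classes get a numeric suffix: Adam, Adam-1, ...
        suffix = f"-{seen[cls_name]}" if seen[cls_name] else ""
        base = f"{cls_name}{suffix}"
        seen[cls_name] += 1
        if n_groups > 1:
            # Multiple parameter groups get /pg1, /pg2, ... suffixes.
            names.extend(f"{base}/pg{i}" for i in range(1, n_groups + 1))
        else:
            names.append(base)
    return names

print(auto_lr_names([("Adam", 1), ("Adam", 1), ("SGD", 2)]))
# → ['Adam', 'Adam-1', 'SGD/pg1', 'SGD/pg2']
```

A scheduler constructed with an explicit 'name' key, as in the example above, would bypass this automatic scheme and be logged under the given name instead.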