LearningRateMonitor¶
class pytorch_lightning.callbacks.LearningRateMonitor(logging_interval=None)[source]¶

Bases: pytorch_lightning.callbacks.base.Callback
Automatically monitors and logs the learning rate of learning rate schedulers during training.
- Parameters

logging_interval¶ (Optional[str]) – set to 'epoch' or 'step' to log the lr of all optimizers at the same interval, or set to None to log at the individual interval according to the interval key of each scheduler. Defaults to None.
Example:
>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import LearningRateMonitor
>>> lr_monitor = LearningRateMonitor(logging_interval='step')
>>> trainer = Trainer(callbacks=[lr_monitor])
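With logging_interval=None, the callback instead follows each scheduler's own interval key. A minimal sketch of such a configuration inside a LightningModule (the SGD settings and StepLR step_size here are illustrative, not prescribed by the callback):

def configure_optimizers(self):
    optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
    lr_scheduler = {
        'scheduler': torch.optim.lr_scheduler.StepLR(optimizer, step_size=10),
        # with logging_interval=None, the lr is logged at this scheduler's own interval
        'interval': 'step',
    }
    return [optimizer], [lr_scheduler]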
Logging names are automatically determined based on the optimizer class name. In the case of multiple optimizers of the same type, they will be named Adam, Adam-1, etc. If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2, etc. To control naming, pass a name key when constructing the learning rate scheduler, as in the example and sketch below.
Example:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(...)
    lr_scheduler = {'scheduler': torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
                    'name': 'my_logging_name'}
    return [optimizer], [lr_scheduler]
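For the multiple-parameter-group naming described above, a minimal sketch (the self.encoder and self.decoder attributes and the learning rates are assumptions for illustration); with this setup the callback would log the two rates as Adam/pg1 and Adam/pg2:

def configure_optimizers(self):
    # two parameter groups with different learning rates;
    # the callback logs them as Adam/pg1 and Adam/pg2
    optimizer = torch.optim.Adam([
        {'params': self.encoder.parameters(), 'lr': 1e-3},  # assumed submodule
        {'params': self.decoder.parameters(), 'lr': 1e-4},  # assumed submodule
    ])
    return optimizer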