Experiment Logging
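
Whichever logger you pick below, scalar metrics are usually logged from the LightningModule itself rather than through each logger's native API. The following is a minimal sketch assuming Lightning 1.0 or later, where self.log is available; the model, data shapes, and learning rate are placeholders. Whatever you pass to self.log is forwarded to the logger configured on the Trainer.

import torch
import torch.nn.functional as F
from pytorch_lightning import LightningModule

class LitModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(28 * 28, 10)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.layer(x), y)
        # self.log forwards the value to whichever logger the Trainer was given
        self.log('train_loss', loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)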

Comet.ml

Comet.ml is a third-party logger. To use CometLogger, first install the package:

pip install comet-ml

Then configure the logger and pass it to the Trainer:

import os
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import CometLogger
comet_logger = CometLogger(
    api_key=os.environ.get('COMET_API_KEY'),
    workspace=os.environ.get('COMET_WORKSPACE'),  # Optional
    save_dir='.',  # Optional
    project_name='default_project',  # Optional
    rest_api_key=os.environ.get('COMET_REST_API_KEY'),  # Optional
    experiment_name='default'  # Optional
)
trainer = Trainer(logger=comet_logger)

The CometLogger is available anywhere in your LightningModule except __init__.

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        # self.logger.experiment is the native comet_ml Experiment, which uses log_image
        self.logger.experiment.log_image(some_img, name='generated_images', step=0)
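
The experiment attribute gives you the full native Comet API, not just images. A minimal sketch, assuming the standard comet_ml Experiment methods log_metric and log_parameters; the metric value and parameters are illustrative placeholders:

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        # log a scalar and a dict of hyperparameters through the native Comet API
        self.logger.experiment.log_metric('val_accuracy', 0.92, step=self.current_epoch)
        self.logger.experiment.log_parameters({'optimizer': 'adam', 'lr': 1e-3})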

See also

CometLogger docs.


MLflow

MLflow is a third-party logger. To use MLFlowLogger, first install the package:

pip install mlflow

Then configure the logger and pass it to the Trainer:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger
mlf_logger = MLFlowLogger(
    experiment_name="default",
    tracking_uri="file:./ml-runs"
)
trainer = Trainer(logger=mlf_logger)
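
MLFlowLogger exposes the underlying MlflowClient as self.logger.experiment. The client API addresses runs explicitly, so calls need the run id, which the logger provides as self.logger.run_id. A minimal sketch, assuming the standard MlflowClient.log_metric signature; the metric value is an illustrative placeholder:

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        # the MLflow client API needs the id of the current run
        self.logger.experiment.log_metric(self.logger.run_id, 'val_accuracy', 0.92)

Runs written to file:./ml-runs can later be browsed locally with mlflow ui --backend-store-uri file:./ml-runs.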

See also

MLFlowLogger docs.


Neptune.ai

Neptune.ai is a third-party logger. To use NeptuneLogger, first install the package:

pip install neptune-client

Then configure the logger and pass it to the Trainer:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import NeptuneLogger

neptune_logger = NeptuneLogger(
    api_key='ANONYMOUS',  # replace with your own
    project_name='shared/pytorch-lightning-integration',
    experiment_name='default',  # Optional
    params={'max_epochs': 10},  # Optional
    tags=['pytorch-lightning', 'mlp'],  # Optional
)
trainer = Trainer(logger=neptune_logger)

The NeptuneLogger is available anywhere in your LightningModule except __init__.

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        # self.logger.experiment is the native Neptune experiment, which uses log_image
        self.logger.experiment.log_image('generated_images', some_img)
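
The experiment attribute is the native Neptune experiment, so the other channels of the legacy neptune-client API are available as well. A minimal sketch, assuming the legacy log_metric, log_text, and append_tag methods; the values are illustrative placeholders:

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        # each call appends to the named channel of the Neptune experiment
        self.logger.experiment.log_metric('val_accuracy', 0.92)
        self.logger.experiment.log_text('notes', 'first full run')
        self.logger.experiment.append_tag('baseline')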

See also

NeptuneLogger docs.


TensorBoard

TensorBoard is the default logger in Lightning. To use TensorBoardLogger, configure it and pass it to the Trainer:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger
logger = TensorBoardLogger('tb_logs', name='my_model')
trainer = Trainer(logger=logger)

The TensorBoardLogger is available anywhere in your LightningModule except __init__.

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        self.logger.experiment.add_image('generated_images', some_img, 0)
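
Because self.logger.experiment is a torch.utils.tensorboard SummaryWriter, any of its add_* methods can be used directly. A minimal sketch logging a scalar and a weight histogram; self.layer stands in for one of your model's submodules and the scalar value is a placeholder:

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        # log a scalar and the distribution of one layer's weights
        self.logger.experiment.add_scalar('lr', 1e-3, self.current_epoch)
        self.logger.experiment.add_histogram('layer_weights', self.layer.weight, self.current_epoch)

The resulting logs can be viewed by pointing TensorBoard at the save directory, e.g. tensorboard --logdir tb_logs.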

See also

TensorBoardLogger docs.


Test Tube

Test Tube is a third-party logger that writes TensorBoard-compatible logs with a nicer file structure. To use TestTubeLogger, first install the package:

pip install test_tube

Then configure the logger and pass it to the Trainer:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TestTubeLogger
logger = TestTubeLogger('tb_logs', name='my_model')
trainer = Trainer(logger=logger)

The TestTubeLogger is available anywhere in your LightningModule except __init__.

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        self.logger.experiment.add_image('generated_images', some_img, 0)

See also

TestTubeLogger docs.


Weights & Biases

Weights & Biases is a third-party logger. To use WandbLogger, first install the package:

pip install wandb

Then configure the logger and pass it to the Trainer:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import WandbLogger
wandb_logger = WandbLogger(offline=True)  # offline=True keeps runs local; omit it to sync to wandb.ai
trainer = Trainer(logger=wandb_logger)

The WandbLogger is available anywhere in your LightningModule except __init__.

import wandb

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        # self.logger.experiment is the native wandb run object
        self.logger.experiment.log({
            "generated_images": [wandb.Image(some_img, caption="...")]
        })
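
WandbLogger can also ask wandb to track gradients and parameter histograms during training via its watch method. A minimal sketch, assuming WandbLogger.watch with wandb's usual log and log_freq arguments:

from pytorch_lightning import Trainer

model = MyModule()
# record gradients and parameter histograms for the wrapped model
wandb_logger.watch(model, log='gradients', log_freq=100)
trainer = Trainer(logger=wandb_logger)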

See also

WandbLogger docs.


Multiple Loggers

Lightning supports the use of multiple loggers; just pass a list to the Trainer.

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger, TestTubeLogger
logger1 = TensorBoardLogger('tb_logs', name='my_model')
logger2 = TestTubeLogger('tb_logs', name='my_model')
trainer = Trainer(logger=[logger1, logger2])

The loggers are available as a list anywhere in your LightningModule except __init__.

class MyModule(LightningModule):
    def any_lightning_module_function_or_hook(self):
        some_img = fake_image()
        # Option 1
        self.logger.experiment[0].add_image('generated_images', some_img, 0)
        # Option 2
        self.logger[0].experiment.add_image('generated_images', some_img, 0)
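
For scalar metrics there is no need to address each logger individually: a value logged with self.log (Lightning 1.0 or later) is forwarded to every logger in the list. A minimal sketch, where compute_loss is a hypothetical helper standing in for your own loss computation:

class MyModule(LightningModule):
    def training_step(self, batch, batch_idx):
        loss = self.compute_loss(batch)  # hypothetical helper, for illustration only
        # a single call writes 'train_loss' to both TensorBoard and Test Tube
        self.log('train_loss', loss)
        return loss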