
test_tube

Classes

TestTubeLogger – Log to local file system in TensorBoard format but using a nicer folder structure.

Test Tube Logger

class pytorch_lightning.loggers.test_tube.TestTubeLogger(save_dir, name='default', description=None, debug=False, version=None, create_git_tag=False, log_graph=False, prefix='')[source]

Bases: pytorch_lightning.loggers.base.LightningLoggerBase

Log to local file system in TensorBoard format but using a nicer folder structure (see full docs).

Install it with pip:

pip install test_tube

Example:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TestTubeLogger

logger = TestTubeLogger("tt_logs", name="my_exp_name")
trainer = Trainer(logger=logger)

Use the logger anywhere in your LightningModule as follows:

from pytorch_lightning import LightningModule


class LitModel(LightningModule):
    def training_step(self, batch, batch_idx):
        # example
        self.logger.experiment.whatever_method_summary_writer_supports(...)

    def any_lightning_module_function_or_hook(self):
        self.logger.experiment.add_histogram(...)

Parameters
  • save_dir (str) – Save directory.

  • name (str) – Experiment name. Defaults to 'default'.

  • description (Optional[str]) – A short snippet about this experiment.

  • debug (bool) – If True, it doesn’t log anything.

  • version (Optional[int]) – Experiment version. If the version is not specified, the logger inspects the save directory for existing versions and automatically assigns the next available one.

  • create_git_tag (bool) – If True, creates a git tag to save the code used in this experiment.

  • log_graph (bool) – Adds the computational graph to TensorBoard. This requires that the user has defined the self.example_input_array attribute in their model.

  • prefix (str) – A string to put at the beginning of metric keys.

Raises

ImportError – If the required TestTube package is not installed.
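
A minimal sketch of constructing the logger with several of the options listed above; the directory, experiment name, description, and version below are placeholders:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TestTubeLogger

logger = TestTubeLogger(
    save_dir="tt_logs",          # placeholder directory; logs are written here
    name="my_exp_name",          # placeholder experiment name (folder under save_dir)
    description="baseline run",  # short free-text note stored with the experiment
    version=3,                   # pin a version instead of auto-incrementing
    create_git_tag=True,         # tag the current commit for reproducibility
)
trainer = Trainer(logger=logger)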

close()[source]

Do any cleanup that is necessary to close an experiment.

Return type

None

finalize(status)[source]

Do any processing that is necessary to finalize an experiment.

Parameters

status (str) – Status that the experiment finished with (e.g. success, failed, aborted)

Return type

None
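
The Trainer normally calls finalize() for you at the end of a run; a minimal sketch of invoking it manually, e.g. when driving the logger outside a Trainer:

logger.finalize("success")  # status string, e.g. "success", "failed", "aborted"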

log_graph(model, input_array=None)[source]

Record the model graph.

Parameters
  • model (LightningModule) – lightning model

  • input_array – input passed to model.forward
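
This method is normally invoked for you when log_graph=True is passed to the constructor; a minimal sketch, using a toy model, of the self.example_input_array attribute it requires:

import torch
from pytorch_lightning import LightningModule


class LitModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # required so the logger can trace a forward pass and record the graph
        self.example_input_array = torch.randn(1, 32)

    def forward(self, x):
        return self.layer(x)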

log_hyperparams(params)[source]

Record hyperparameters.

Parameters
  • params (Union[Dict[str, Any], Namespace]) – Dictionary or Namespace containing the hyperparameters

  • args – Optional positional arguments, depending on the specific logger being used

  • kwargs – Optional keyword arguments, depending on the specific logger being used

Return type

None
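
A minimal sketch, with placeholder names and values, of logging hyperparameters as a plain dict (an argparse.Namespace works as well):

logger.log_hyperparams({"learning_rate": 1e-3, "batch_size": 64})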

log_metrics(metrics, step=None)[source]

Records metrics. This method logs metrics as soon as it receives them. If you want to aggregate metrics for one specific step, use the agg_and_log_metrics() method.

Parameters
  • metrics (Dict[str, float]) – Dictionary with metric names as keys and measured quantities as values

  • step (Optional[int]) – Step number at which the metrics should be recorded

Return type

None
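
A minimal sketch, with placeholder metric names and step value, of logging a metrics dictionary directly:

logger.log_metrics({"train_loss": 0.42, "val_acc": 0.91}, step=100)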

save()[source]

Save log data.

Return type

None

property experiment: test_tube.Experiment

Actual TestTube object. To use TestTube features in your LightningModule, do the following.

Example:

self.logger.experiment.some_test_tube_function()

Return type

Experiment

property name: str

Gets the experiment name.

Return type

str

Returns

The experiment name if the experiment exists, else the name specified in the constructor.

property save_dir: Optional[str]

Gets the save directory.

Return type

Optional[str]

Returns

The path to the save directory.

property version: int

Gets the experiment version.

Return type

int

Returns

The experiment version if the experiment exists, else the next version.
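
A minimal sketch of reading the bookkeeping properties above, assuming the logger from the earlier example; the commented values are placeholders:

print(logger.name)      # e.g. "my_exp_name"
print(logger.save_dir)  # e.g. "tt_logs"
print(logger.version)   # e.g. 0 for a fresh experiment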