pytorch_lightning.trainer.callback_config module

class pytorch_lightning.trainer.callback_config.TrainerCallbackConfigMixin[source]

Bases: abc.ABC

configure_checkpoint_callback(checkpoint_callback)[source]

The weights path is set in this priority:

1. The checkpoint_callback's own path (if one was passed in).
2. A user-provided weights_save_path.
3. Otherwise, os.getcwd().
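
A minimal sketch of this resolution order, assuming a hypothetical helper named resolve_weights_path (the function and argument names are illustrative, not the library's actual code):

    import os

    def resolve_weights_path(checkpoint_callback_path=None, weights_save_path=None):
        # Illustrative only: mirrors the priority described above.
        # 1. Path carried by the checkpoint callback, if one was passed in.
        if checkpoint_callback_path:
            return checkpoint_callback_path
        # 2. User-provided weights_save_path.
        if weights_save_path:
            return weights_save_path
        # 3. Otherwise fall back to the current working directory.
        return os.getcwd()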

configure_early_stopping(early_stop_callback)[source]
configure_progress_bar(refresh_rate=1, process_position=0)[source]
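
These hooks are normally exercised through the Trainer constructor rather than called directly. A hedged usage sketch, assuming the Trainer arguments documented for this release (early_stop_callback, progress_bar_refresh_rate, process_position):

    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import EarlyStopping

    # The mixin turns these constructor arguments into configured callbacks.
    trainer = Trainer(
        early_stop_callback=EarlyStopping(monitor="val_loss", patience=3),
        progress_bar_refresh_rate=1,  # forwarded to configure_progress_bar
        process_position=0,
    )
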
abstract is_overridden(*args)[source]

Warning: this is just an empty shell for code implemented in another class.

abstract save_checkpoint(*args)[source]

Warning: this is just an empty shell for code implemented in another class.
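
The abstract members above reflect the mixin pattern used by the Trainer: each mixin declares the hooks it needs as empty shells, and the concrete Trainer class, which inherits from all the mixins, supplies the real implementations. A minimal, self-contained illustration of the pattern (class and method names here are invented for the example, not taken from the library):

    from abc import ABC, abstractmethod

    class ConfigMixin(ABC):
        @abstractmethod
        def save_checkpoint(self, *args):
            """Empty shell; a sibling class provides the real implementation."""

    class CheckpointIOMixin:
        def save_checkpoint(self, filepath):
            print(f"writing checkpoint to {filepath}")

    class MiniTrainer(CheckpointIOMixin, ConfigMixin):
        # Inherits the concrete save_checkpoint, which satisfies the abstract hook.
        pass

    MiniTrainer().save_checkpoint("weights.ckpt")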

callbacks: List[Callback] = None[source]
checkpoint_callback: Optional[ModelCheckpoint] = None[source]
ckpt_path: str = None[source]
default_root_dir: str = None[source]
logger: LightningLoggerBase = None[source]
abstract property slurm_job_id[source]

Warning: this is just an empty shell for code implemented in another class.

Return type

int

weights_save_path: Optional[str] = None[source]