pytorch_lightning.core.saving module
class pytorch_lightning.core.saving.ModelIO
    Bases: object
    on_hpc_load(checkpoint)
        Hook to do whatever you need right before the Slurm manager loads the model.

    on_hpc_save(checkpoint)
        Hook to do whatever you need right before the Slurm manager saves the model.
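Both hooks receive the checkpoint dictionary, so a model can stash extra state just before an HPC save and restore it on load. A minimal sketch of that pattern (`MyModel` and `wall_time_used` are hypothetical names for illustration; a real model would inherit these hooks from ModelIO via LightningModule):

```python
class MyModel:
    """Hypothetical model sketching the HPC checkpoint hooks."""

    def __init__(self):
        self.wall_time_used = 0.0  # example of extra state worth checkpointing

    def on_hpc_save(self, checkpoint):
        # Called right before the Slurm manager saves the model:
        # stash extra state into the checkpoint dict.
        checkpoint['wall_time_used'] = self.wall_time_used

    def on_hpc_load(self, checkpoint):
        # Called right before the Slurm manager loads the model:
        # restore the stashed state, defaulting if it is absent.
        self.wall_time_used = checkpoint.get('wall_time_used', 0.0)
```
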
pytorch_lightning.core.saving.load_hparams_from_tags_csv(tags_csv)
    Load hparams from a file.

    >>> hparams = Namespace(batch_size=32, learning_rate=0.001, data_root='./any/path/here')
    >>> path_csv = './testing-hparams.csv'
    >>> save_hparams_to_tags_csv(path_csv, hparams)
    >>> hparams_new = load_hparams_from_tags_csv(path_csv)
    >>> vars(hparams) == hparams_new
    True
    >>> os.remove(path_csv)
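For illustration, the round trip in the doctest can be approximated with the stdlib csv module. `save_tags_csv` and `load_tags_csv` below are hypothetical stand-ins, and the two-column "key,value" layout is an assumption about the on-disk format. Note one difference visible in the sketch: a plain CSV round trip returns every value as a string, whereas the doctest above shows that load_hparams_from_tags_csv restores the original types (`vars(hparams) == hparams_new` is True even though batch_size is an int).

```python
import csv

# Hypothetical stand-ins for save_hparams_to_tags_csv /
# load_hparams_from_tags_csv, assuming a two-column "key,value" table.
def save_tags_csv(path, hparams):
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['key', 'value'])   # header row
        for key, value in hparams.items():
            writer.writerow([key, value])

def load_tags_csv(path):
    with open(path, newline='') as f:
        reader = csv.reader(f)
        next(reader)                        # skip the "key,value" header
        return {key: value for key, value in reader}
```
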
pytorch_lightning.core.saving.load_hparams_from_yaml(config_yaml)
    Load hparams from a file.

    >>> hparams = Namespace(batch_size=32, learning_rate=0.001, data_root='./any/path/here')
    >>> path_yaml = './testing-hparams.yaml'
    >>> save_hparams_to_yaml(path_yaml, hparams)
    >>> hparams_new = load_hparams_from_yaml(path_yaml)
    >>> vars(hparams) == hparams_new
    True
    >>> os.remove(path_yaml)
pytorch_lightning.core.saving.update_hparams(hparams, updates)
    Overrides hparams with new values.

    >>> hparams = {'c': 4}
    >>> update_hparams(hparams, {'a': {'b': 2}, 'c': 1})
    >>> hparams['a']['b'], hparams['c']
    (2, 1)
    >>> update_hparams(hparams, {'a': {'b': 4}, 'c': 7})
    >>> hparams['a']['b'], hparams['c']
    (4, 7)
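The doctest implies a recursive, in-place merge: nested dicts are updated key by key, and keys that do not exist yet are inserted. A minimal sketch of that documented behavior (not the library's actual implementation):

```python
def update_hparams(hparams, updates):
    """Recursively override hparams with new values, in place,
    inserting keys that do not exist yet (sketch of the documented
    behavior, not pytorch_lightning's actual implementation)."""
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(hparams.get(key), dict):
            # Both sides are dicts: merge the nested dict in place.
            update_hparams(hparams[key], value)
        else:
            # Otherwise override the existing value or insert a new key.
            hparams[key] = value
```
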