TrainingBatchLoop

class pytorch_lightning.loops.batch.TrainingBatchLoop[source]

Bases: pytorch_lightning.loops.base.Loop[List[Union[Dict[int, Dict[str, Any]], Dict[str, Any]]]]

Runs over a single batch of data.
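
A minimal sketch of where this loop sits in the Trainer, assuming the 1.5-era loop hierarchy in which the fit loop owns a training epoch loop that in turn owns this batch loop; the attribute path below is an assumption based on that layout, not part of the documented API on this page.

    import pytorch_lightning as pl
    from pytorch_lightning.loops.batch import TrainingBatchLoop

    trainer = pl.Trainer(max_epochs=1)

    # Assumed attribute path: Trainer -> FitLoop -> TrainingEpochLoop -> TrainingBatchLoop
    batch_loop = trainer.fit_loop.epoch_loop.batch_loop
    assert isinstance(batch_loop, TrainingBatchLoop)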

advance(batch, batch_idx)[source]

Runs the train step together with optimization (if necessary) on the current batch split.

Parameters
  • batch – the current batch to run training on (this is the full batch, not an individual split)

  • batch_idx – the index of the current batch

connect(optimizer_loop=None, manual_loop=None)[source]

Optionally connect one or more loops to this one.

Linked loops should form a tree.

Return type

None
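
A hedged sketch of connect(), swapping in a customized optimizer loop. The OptimizerLoop import path, the *args/**kwargs pass-through in the override, and the epoch loop's own connect(batch_loop=...) call are assumptions based on the 1.5-era loop API, not guaranteed by this page.

    import pytorch_lightning as pl
    from pytorch_lightning.loops.batch import TrainingBatchLoop
    from pytorch_lightning.loops import OptimizerLoop  # assumed export location

    class VerboseOptimizerLoop(OptimizerLoop):
        """Hypothetical subclass used only for illustration."""

        def advance(self, *args, **kwargs):
            print("running one optimizer step")
            return super().advance(*args, **kwargs)

    batch_loop = TrainingBatchLoop()
    batch_loop.connect(optimizer_loop=VerboseOptimizerLoop())  # manual_loop keeps its default

    trainer = pl.Trainer(max_epochs=1)
    # assumed: the epoch loop exposes a matching connect(batch_loop=...) hook
    trainer.fit_loop.epoch_loop.connect(batch_loop=batch_loop)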

on_run_end()[source]

Hook to be called at the end of the run.

Its return value is returned from run.

Return type

List[Union[Dict[int, Dict[str, Any]], Dict[str, Any]]]
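
The annotation above covers both optimization modes. The layout sketched below is a hedged reading of that type, not a verbatim structure from the source: one list entry per TBPTT split, keyed by optimizer index under automatic optimization, or a plain output dict under manual optimization.

    # Hedged reading of List[Union[Dict[int, Dict[str, Any]], Dict[str, Any]]]:
    outputs = [
        {0: {"loss": 0.42}},  # split 0, optimizer index 0 (automatic optimization)
        {0: {"loss": 0.37}},  # split 1, optimizer index 0
    ]
    # under manual optimization each entry would instead be the raw dict, e.g. {"loss": 0.42}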

on_run_start(batch, batch_idx)[source]

Splits the batch into truncated back-propagation through time (TBPTT) splits.

Parameters
  • batch (Any) – the current batch to run the train step on

  • batch_idx (int) – the index of the current batch
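
Splitting only happens when truncated back-propagation through time is enabled on the LightningModule. A minimal sketch using the documented truncated_bptt_steps attribute and the hiddens argument of training_step (layer sizes and loss are illustrative only):

    import torch
    import pytorch_lightning as pl

    class TBPTTModule(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.rnn = torch.nn.LSTM(input_size=8, hidden_size=8, batch_first=True)
            # each batch is chopped along the time dimension into chunks of 2 steps;
            # on_run_start then produces one split per chunk
            self.truncated_bptt_steps = 2

        def training_step(self, batch, batch_idx, hiddens):
            x, y = batch
            out, hiddens = self.rnn(x, hiddens)
            loss = torch.nn.functional.mse_loss(out, y)
            return {"loss": loss, "hiddens": hiddens}

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)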

reset()[source]

Resets the loop state.

Return type

None

teardown()[source]

Use this hook to release memory and other resources.

Return type

None

property done: bool

Returns whether all batch splits have been processed.

Return type

bool
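
Taken together, the hooks on this page follow the base Loop contract: run() resets the loop, performs the TBPTT split in on_run_start(), calls advance() until done is True, and returns whatever on_run_end() produces. The sketch below is a simplified rendering of that contract; the real base class adds further hooks around each advance plus restart and state handling, and teardown() is invoked separately, outside run().

    def run_sketch(loop, batch, batch_idx):
        # simplified sketch of pytorch_lightning.loops.base.Loop.run as it applies here
        loop.reset()
        loop.on_run_start(batch, batch_idx)   # splits the batch into TBPTT splits
        while not loop.done:                  # True once every split has been processed
            loop.advance(batch, batch_idx)    # train step + optimization for one split
        return loop.on_run_end()              # the per-split outputs described above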
