TrainingBatchLoop

class pytorch_lightning.loops.batch.TrainingBatchLoop[source]

Bases: pytorch_lightning.loops.base.Loop[List[Union[Dict[int, Dict[str, Any]], Dict[str, Any]]]]

Runs over a single batch of data.
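
For orientation, the sketch below paraphrases how the generic Loop.run skeleton drives the hooks documented on this page: reset, then on_run_start, then advance once per batch split until done, and finally on_run_end. This is an illustrative outline, not Lightning's exact implementation.

    # Illustrative paraphrase of the Loop.run skeleton, not the actual code.
    def run_sketch(loop, batch, batch_idx):
        loop.reset()                         # clear state kept from the previous batch
        loop.on_run_start(batch, batch_idx)  # split the batch into tbptt splits
        while not loop.done:                 # one advance() per batch split
            loop.advance(batch, batch_idx)   # train step + optimization on the split
        return loop.on_run_end()             # its return value is returned from run()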

advance(batch, batch_idx)[source]

Runs the train step together with optimization (if necessary) on the current batch split.

Parameters
  • batch (Any) – the current batch to run the training on (this is not the split!)

  • batch_idx (int) – the index of the current batch

Return type

None

connect(optimizer_loop=None, manual_loop=None)[source]

Optionally connect one or more loops to this one; see the example below.

Linked loops should form a tree.

Return type

None
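
A minimal usage sketch: replacing the child loops that handle automatic and manual optimization. The import path for OptimizerLoop and ManualOptimization is an assumption based on Lightning 1.5-era releases and may differ in other versions.

    from pytorch_lightning.loops.batch import TrainingBatchLoop
    # Assumed import path; it may vary between Lightning releases.
    from pytorch_lightning.loops.optimization import ManualOptimization, OptimizerLoop

    batch_loop = TrainingBatchLoop()
    # Attach the child loops; linked loops form a tree rooted at this loop.
    batch_loop.connect(optimizer_loop=OptimizerLoop(), manual_loop=ManualOptimization())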

on_run_end()[source]

Hook to be called at the end of the run.

Its return value is returned from run.

Return type

List[Union[Dict[int, Dict[str, Any]], Dict[str, Any]]]

on_run_start(batch, batch_idx)[source]

Splits the data into tbptt (truncated backpropagation through time) splits; see the sketch below.

Parameters
  • batch (Any) – the current batch to run the train step on

  • batch_idx (int) – the index of the current batch

Return type

None
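
Conceptually, with truncated_bptt_steps set on the LightningModule, each sequence batch is sliced along its time dimension (dimension 1 in Lightning's default tbptt_split_batch) into fixed-size chunks, and advance then runs once per chunk. The helper below is a hypothetical illustration of that idea, not this hook's actual implementation.

    import torch

    # Hypothetical helper mirroring the default tbptt splitting behavior:
    # slice each tensor along dim 1 (the time dimension) into equal chunks.
    def tbptt_split_sketch(batch: torch.Tensor, split_size: int):
        return list(torch.split(batch, split_size, dim=1))

    batch = torch.randn(32, 100, 8)         # 32 sequences, 100 time steps, 8 features
    splits = tbptt_split_sketch(batch, 20)  # 5 splits of 20 time steps each
    assert len(splits) == 5 and splits[0].shape == (32, 20, 8)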

reset()[source]

Resets the loop state.

Return type

None

teardown()[source]

Use this hook to release memory and other resources.

Return type

None

property done: bool

Returns whether all batch splits have been processed.

Return type

bool