ManualOptimization

class pytorch_lightning.loops.optimization.ManualOptimization[source]

Bases: pytorch_lightning.loops.base.Loop[Dict[str, Any]]

A special loop implementing what is known in Lightning as Manual Optimization, where the optimization happens entirely within training_step() and the user is therefore responsible for back-propagating gradients and making calls to the optimizers.

This loop is a trivial case: it performs only a single iteration, calling directly into the module’s training_step() and passing through the output(s).
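The contract this places on the user can be illustrated with a dependency-free sketch: the loop body calls training_step() exactly once per batch, and training_step() itself performs the backward pass and the optimizer update. The names here (ToyModule, lr) are hypothetical and only stand in for a real LightningModule with automatic optimization disabled:

```python
# Sketch of the manual-optimization contract: training_step() computes the
# gradient and updates the parameter itself; the loop merely invokes it once
# per batch and passes the returned output through.

class ToyModule:
    """Minimizes f(w) = (w - 3)^2 by hand inside training_step()."""

    def __init__(self, lr=0.1):
        self.w = 0.0
        self.lr = lr

    def training_step(self, batch, batch_idx):
        loss = (self.w - 3.0) ** 2
        grad = 2.0 * (self.w - 3.0)   # manual "backward" pass
        self.w -= self.lr * grad      # manual optimizer step
        return {"loss": loss}         # output passed through by the loop


module = ToyModule()
outputs = [module.training_step(batch=None, batch_idx=i) for i in range(100)]
print(round(module.w, 3))  # → 3.0 (w converges to the minimum)
```

In a real LightningModule this corresponds to setting self.automatic_optimization = False and calling self.manual_backward(loss) and the optimizer’s step() inside training_step().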

output_result_cls

alias of pytorch_lightning.loops.optimization.manual_loop.ManualResult

advance(batch, batch_idx)[source]

Performs the training step for manual optimization.

Parameters
  • batch (Any) – the current truncated backpropagation-through-time (TBPTT) split of the batch

  • batch_idx (int) – the index of the current batch

Return type

None

on_run_end()[source]

Returns the result of this loop, i.e., the post-processed outputs from the training step.

Return type

Dict[str, Any]

on_run_start(*_, **__)[source]

Hook to be called as the first thing after entering run (except the state reset).

Accepts all arguments passed to run.

Return type

None

reset()[source]

Resets the internal state of the loop at the beginning of each call to run.

Example:

def reset(self):
    # reset your internal state or add custom logic
    # if you expect run() to be called multiple times
    self.current_iteration = 0
    self.outputs = []

Return type

None

property done: bool

Property indicating when the loop is finished.

Example:

@property
def done(self):
    return self.trainer.global_step >= self.trainer.max_steps

Return type

bool
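Taken together, reset(), advance(), done, and on_run_end() follow the generic run sequence of the Loop base class: reset state, advance until done, then return the result of on_run_end(). A minimal dependency-free sketch of that sequence, assuming a simplified run() method and a hypothetical single-iteration loop (SingleIterationLoop is not a Lightning class):

```python
# Simplified sketch of the Loop run sequence that ManualOptimization follows:
# reset internal state, advance until `done` is True, return on_run_end()'s
# post-processed output. Not the actual pytorch_lightning.loops.base.Loop
# implementation, which also handles restarting, connection, and state dicts.

class SingleIterationLoop:
    def __init__(self):
        self.current_iteration = 0
        self.output = None

    @property
    def done(self):
        # Trivial case: finished after a single iteration.
        return self.current_iteration >= 1

    def reset(self):
        self.current_iteration = 0
        self.output = None

    def advance(self, batch, batch_idx):
        # Stand-in for calling the module's training_step().
        self.output = {"loss": float(batch_idx)}
        self.current_iteration += 1

    def on_run_end(self):
        return self.output

    def run(self, batch, batch_idx):
        self.reset()
        while not self.done:
            self.advance(batch, batch_idx)
        return self.on_run_end()


loop = SingleIterationLoop()
result = loop.run(batch=None, batch_idx=7)
print(result)  # → {'loss': 7.0}
```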