profilers¶
Profiler to check if there are any bottlenecks in your code.

Classes

AbstractProfiler | Specification of a profiler.
AdvancedProfiler | This profiler uses Python’s cProfile to record more detailed information about time spent in each function call recorded during a given action.
BaseProfiler | If you wish to write a custom profiler, you should inherit from this class.
PassThroughProfiler | This class should be used when you don’t want the (small) overhead of profiling.
SimpleProfiler | This profiler simply records the duration of actions (in seconds) and reports the mean duration of each action and the total time spent over the entire training run.
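The duration-recording idea behind SimpleProfiler can be sketched with the standard library alone. The class and method names below (MinimalProfiler, summary) are illustrative stand-ins, not pytorch_lightning’s actual implementation:

```python
import time
from collections import defaultdict
from contextlib import contextmanager


class MinimalProfiler:
    """Illustrative sketch: record action durations and report means."""

    def __init__(self):
        self.recorded = defaultdict(list)

    @contextmanager
    def profile(self, action_name):
        # Timing starts on entering the with-block and stops on exit.
        start = time.monotonic()
        try:
            yield
        finally:
            self.recorded[action_name].append(time.monotonic() - start)

    def summary(self):
        # Mean duration per action, plus total time across all actions.
        means = {name: sum(d) / len(d) for name, d in self.recorded.items()}
        total = sum(sum(d) for d in self.recorded.values())
        return means, total


prof = MinimalProfiler()
for _ in range(3):
    with prof.profile("training_step"):
        time.sleep(0.01)
means, total = prof.summary()
```

The real profilers add setup/teardown hooks and report formatting on top of this core start/stop bookkeeping.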
class pytorch_lightning.profiler.profilers.AbstractProfiler[source]¶
Bases: abc.ABC
Specification of a profiler.
abstract setup(**kwargs)[source]¶
Execute arbitrary pre-profiling set-up steps as defined by subclass.
Return type: None
class pytorch_lightning.profiler.profilers.AdvancedProfiler(dirpath=None, filename=None, line_count_restriction=1.0, output_filename=None)[source]¶
Bases: pytorch_lightning.profiler.profilers.BaseProfiler
This profiler uses Python’s cProfile to record more detailed information about time spent in each function call recorded during a given action. The output is quite verbose and you should only use this if you want very detailed reports.
Parameters
dirpath¶ (Union[str, Path, None]) – Directory path for the filename. If dirpath is None but filename is present, the trainer.log_dir (from TensorBoardLogger) will be used.
filename¶ (Optional[str]) – If present, filename where the profiler results will be saved instead of printing to stdout. The .txt extension will be used automatically.
line_count_restriction¶ (float) – Can be used to limit the number of functions reported for each action: either an integer (to select a count of lines) or a decimal fraction between 0.0 and 1.0 inclusive (to select a percentage of lines).
Raises
ValueError – If you attempt to stop recording an action which was never started.
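Mechanically, this amounts to keeping one cProfile profile per action and restricting the pstats report, which is how line_count_restriction maps onto print_stats. A stdlib sketch of that idea (the function names here are illustrative, not the class’s real internals):

```python
import cProfile
import io
import pstats

# Illustrative sketch: one cProfile.Profile per profiled action.
profilers = {}


def start(action_name):
    profilers.setdefault(action_name, cProfile.Profile())
    profilers[action_name].enable()


def stop(action_name):
    profilers[action_name].disable()


def report(action_name, line_count_restriction=1.0):
    stream = io.StringIO()
    stats = pstats.Stats(profilers[action_name], stream=stream)
    # print_stats accepts an int (a count of lines) or a float in
    # [0.0, 1.0] (a fraction of lines), matching line_count_restriction.
    stats.sort_stats("cumulative").print_stats(line_count_restriction)
    return stream.getvalue()


start("load_data")
data = sorted(range(1000), reverse=True)
stop("load_data")
text = report("load_data", line_count_restriction=0.5)
```

The report text is the same verbose per-function breakdown the docstring warns about, which is why this profiler is only worthwhile when you need that level of detail.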
class pytorch_lightning.profiler.profilers.BaseProfiler(dirpath=None, filename=None, output_filename=None)[source]¶
Bases: pytorch_lightning.profiler.profilers.AbstractProfiler
If you wish to write a custom profiler, you should inherit from this class.

profile(action_name)[source]¶
Yields a context manager to encapsulate the scope of a profiled action.
Example:
with self.profile('load training data'):
    # load training data code
The profiler will start once you’ve entered the context and will automatically stop once you exit the code block.
Return type: None

setup(stage=None, local_rank=None, log_dir=None)[source]¶
Execute arbitrary pre-profiling set-up steps.
Return type: None
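The enter-starts/exit-stops behaviour described above can be sketched with contextlib, assuming the start/stop hooks a custom profiler implements. The class name and its event log are illustrative, not the actual base class:

```python
from contextlib import contextmanager


class EventLoggingProfiler:
    """Illustrative custom profiler: records start/stop events so the
    context-manager semantics are visible."""

    def __init__(self):
        self.events = []

    def start(self, action_name):
        self.events.append(("start", action_name))

    def stop(self, action_name):
        self.events.append(("stop", action_name))

    @contextmanager
    def profile(self, action_name):
        self.start(action_name)  # runs on entering the with-block
        try:
            yield
        finally:
            self.stop(action_name)  # runs on exit, even if the block raises


prof = EventLoggingProfiler()
with prof.profile("load training data"):
    pass  # load training data code would go here
```

Because stop() sits in a finally clause, an exception inside the profiled block still closes out the action before propagating.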
class pytorch_lightning.profiler.profilers.PassThroughProfiler(dirpath=None, filename=None, output_filename=None)[source]¶
Bases: pytorch_lightning.profiler.profilers.BaseProfiler
This class should be used when you don’t want the (small) overhead of profiling. The Trainer uses this class by default.
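The pass-through idea can be sketched as a profiler whose profile() context manager does no bookkeeping at all, so code written against the profiler interface runs essentially for free (the class here is an illustrative stand-in, not the actual implementation):

```python
from contextlib import contextmanager


class NoOpProfiler:
    """Illustrative pass-through profiler: nothing is recorded."""

    @contextmanager
    def profile(self, action_name):
        yield  # no timing, no storage: the wrapped code just runs


prof = NoOpProfiler()
with prof.profile("training_step"):
    result = 21 * 2
```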
class pytorch_lightning.profiler.profilers.SimpleProfiler(dirpath=None, filename=None, extended=True, output_filename=None)[source]¶
Bases: pytorch_lightning.profiler.profilers.BaseProfiler
This profiler simply records the duration of actions (in seconds) and reports the mean duration of each action and the total time spent over the entire training run.
Parameters
dirpath¶ (Union[str, Path, None]) – Directory path for the filename. If dirpath is None but filename is present, the trainer.log_dir (from TensorBoardLogger) will be used.
filename¶ (Optional[str]) – If present, filename where the profiler results will be saved instead of printing to stdout. The .txt extension will be used automatically.
Raises
ValueError – If you attempt to start an action which has already started, or if you attempt to stop recording an action which was never started.
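The checks described under Raises can be sketched as follows; StrictTimer is an illustrative stand-in for SimpleProfiler’s internal start/stop bookkeeping, not the class itself:

```python
import time


class StrictTimer:
    """Illustrative sketch of the two ValueError checks: double-start
    and stop-without-start."""

    def __init__(self):
        self.current = {}   # actions currently being timed
        self.recorded = {}  # finished durations per action

    def start(self, action_name):
        if action_name in self.current:
            raise ValueError(f"Attempted to start {action_name} which has already started.")
        self.current[action_name] = time.monotonic()

    def stop(self, action_name):
        if action_name not in self.current:
            raise ValueError(f"Attempting to stop recording an action ({action_name}) which was never started.")
        duration = time.monotonic() - self.current.pop(action_name)
        self.recorded.setdefault(action_name, []).append(duration)


timer = StrictTimer()
timer.start("step")
timer.stop("step")

try:
    timer.stop("step")  # this action is no longer running
except ValueError as err:
    message = str(err)
```

Popping the action out of the in-flight dict on stop is what makes both error cases cheap to detect: a second start finds the key present, and a stray stop finds it absent.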