
memory

Functions

garbage_collection_cuda: Garbage collection of Torch (CUDA) memory.
get_gpu_memory_map: Deprecated since version v1.5.
get_memory_profile: Deprecated since version v1.5.
get_model_size_mb: Calculates the size of a Module in megabytes.
is_cuda_out_of_memory: Returns bool.
is_cudnn_snafu: Returns bool.
is_oom_error: Returns bool.
is_out_of_cpu_memory: Returns bool.
recursive_detach: Detach all tensors in in_dict.

Utilities related to memory.

pytorch_lightning.utilities.memory.garbage_collection_cuda()[source]

Garbage collection of Torch (CUDA) memory.

Return type: None
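
A minimal sketch of how this is typically paired with is_oom_error when recovering from a failed CUDA allocation; the try_forward helper below is an illustrative assumption, not part of the API:

    from pytorch_lightning.utilities.memory import garbage_collection_cuda, is_oom_error

    def try_forward(model, batch):
        # Attempt a forward pass; on an out-of-memory error, release cached
        # CUDA memory and signal the caller to retry (e.g. with a smaller batch).
        try:
            return model(batch)
        except RuntimeError as exception:
            if is_oom_error(exception):
                garbage_collection_cuda()
                return None
            raise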

pytorch_lightning.utilities.memory.get_gpu_memory_map()[source]

Deprecated since version v1.5: This function was deprecated in v1.5 in favor of pytorch_lightning.accelerators.gpu._get_nvidia_gpu_stats and will be removed in v1.7.

Get the current GPU usage.

Return type: Dict[str, float]

Returns: A dictionary in which the keys are device ids as integers and values are memory usage as integers in MB.

Raises: FileNotFoundError – if the nvidia-smi installation is not found.
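
A hedged usage sketch; the exact key strings in the returned mapping come from nvidia-smi and may vary between versions:

    from pytorch_lightning.utilities.memory import get_gpu_memory_map

    # Requires nvidia-smi to be installed and on the PATH.
    usage = get_gpu_memory_map()
    for device_id, used_mb in usage.items():
        print(f"{device_id}: {used_mb} MB used")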

pytorch_lightning.utilities.memory.get_memory_profile(mode)[source]

Deprecated since version v1.5: This function was deprecated in v1.5 in favor of pytorch_lightning.accelerators.gpu._get_nvidia_gpu_stats and will be removed in v1.7.

Get a profile of the current memory usage.

Parameters: mode (str) – There are two modes:

  • 'all': return memory for all GPUs
  • 'min_max': return only the minimum and maximum memory usage

Return type: Dict[str, float]

Returns: A dictionary in which the keys are device ids as integers and values are memory usage as integers in MB. If mode is 'min_max', the dictionary will also contain two additional keys:

  • 'min_gpu_mem': the minimum memory usage in MB
  • 'max_gpu_mem': the maximum memory usage in MB
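
For illustration, a sketch that queries only the extremes using the 'min_max' mode documented above:

    from pytorch_lightning.utilities.memory import get_memory_profile

    profile = get_memory_profile("min_max")
    print("least used GPU (MB):", profile.get("min_gpu_mem"))
    print("most used GPU (MB):", profile.get("max_gpu_mem"))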

pytorch_lightning.utilities.memory.get_model_size_mb(model)[source]

Calculates the size of a Module in megabytes.

The computation includes everything in the state_dict(), i.e., by default the parameters and buffers.

Return type: float

Returns: Number of megabytes in the parameters of the input module.
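
A small sanity check, assuming a plain torch.nn.Linear layer whose state_dict holds roughly one million float32 parameters:

    import torch
    from pytorch_lightning.utilities.memory import get_model_size_mb

    model = torch.nn.Linear(1024, 1024)  # ~1.05M float32 parameters (weights plus bias)
    print(f"{get_model_size_mb(model):.2f} MB")  # on the order of 4 MB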

pytorch_lightning.utilities.memory.recursive_detach(in_dict, to_cpu=False)[source]

Detach all tensors in in_dict.

May operate recursively if some of the values in in_dict are dictionaries which contain instances of torch.Tensor. Other types in in_dict are not affected by this utility function.

Parameters:

  • in_dict (Any) – Dictionary with tensors to detach
  • to_cpu (bool) – Whether to move the tensors to CPU

Returns: out_dict, a dictionary with the detached tensors
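
A minimal sketch with a nested dictionary; the keys and values are made up for illustration:

    import torch
    from pytorch_lightning.utilities.memory import recursive_detach

    outputs = {
        "loss": torch.tensor(1.0, requires_grad=True) * 2,  # attached to an autograd graph
        "metrics": {"acc": torch.tensor(0.9)},  # nested dicts are handled recursively
        "step": 7,  # non-tensor values pass through unchanged
    }
    detached = recursive_detach(outputs, to_cpu=True)
    assert detached["loss"].requires_grad is False
    assert detached["metrics"]["acc"].device.type == "cpu"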
