TPUAccelerator

class pytorch_lightning.accelerators.TPUAccelerator[source]

Bases: pytorch_lightning.accelerators.accelerator.Accelerator

Accelerator for TPU devices.
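A minimal usage sketch (assuming an 8-core TPU host; the model and data are whatever you would normally pass to Trainer.fit):

    import pytorch_lightning as pl

    # Select the TPU accelerator through the Trainer.
    # devices=8 assumes a standard 8-core TPU host; use devices=1 for a single core.
    trainer = pl.Trainer(accelerator="tpu", devices=8)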

static auto_device_count()[source]

Get the number of available devices when the devices argument is set to "auto".

Return type

int
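For example (assuming torch_xla is installed and TPU cores are visible to the process; on a standard TPU host this is typically 8):

    from pytorch_lightning.accelerators import TPUAccelerator

    # Number of TPU cores Lightning would use for devices="auto".
    print(TPUAccelerator.auto_device_count())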

get_device_stats(device)[source]

Gets stats for the given TPU device.

Parameters

device (Union[str, torch.device]) – TPU device for which to get stats

Return type

Dict[str, Any]

Returns

A dictionary mapping the metrics (free memory and peak memory) to their values.
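A small sketch, assuming torch_xla is installed so that xm.xla_device() can resolve a TPU device (the exact metric names in the returned dictionary may vary between versions):

    import torch_xla.core.xla_model as xm
    from pytorch_lightning.accelerators import TPUAccelerator

    accelerator = TPUAccelerator()
    stats = accelerator.get_device_stats(xm.xla_device())
    for name, value in stats.items():  # e.g. free memory and peak memory figures
        print(name, value)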

static get_parallel_devices(devices)[source]

Gets parallel devices for the Accelerator.

Return type

List[int]
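For example, passing an integer core count is expected to yield a list of core indices (the output shown below is an assumption based on the List[int] return type):

    from pytorch_lightning.accelerators import TPUAccelerator

    # An integer count is expanded to a list of TPU core indices,
    # e.g. 8 -> [0, 1, 2, 3, 4, 5, 6, 7].
    print(TPUAccelerator.get_parallel_devices(8))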

static is_available()[source]

Detect if the hardware is available.

Return type

bool
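This can be used to guard TPU-specific code paths, for example:

    from pytorch_lightning.accelerators import TPUAccelerator

    if TPUAccelerator.is_available():
        print("TPU cores detected")
    else:
        print("No TPU available; falling back to CPU/GPU")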

static parse_devices(devices)[source]

Parse the devices argument into a format this accelerator understands.

Return type

Union[List[int], int, None]
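A short sketch of how the Trainer's devices flag might be parsed for TPUs (the commented outputs are assumptions based on the return type, not normative values):

    from pytorch_lightning.accelerators import TPUAccelerator

    print(TPUAccelerator.parse_devices(8))     # an integer core count, e.g. 8
    print(TPUAccelerator.parse_devices("8"))   # string values are also accepted
    print(TPUAccelerator.parse_devices([5]))   # a list selects a specific core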
