DDP2Plugin

class pytorch_lightning.plugins.training_type.DDP2Plugin(parallel_devices=None, num_nodes=1, cluster_environment=None, sync_batchnorm=False, ddp_comm_state=None, ddp_comm_hook=None, ddp_comm_wrapper=None, **kwargs)

Bases: pytorch_lightning.plugins.training_type.ddp.DDPPlugin
DDP2 behaves like DP within a single node, while synchronization across nodes behaves like in DDP.
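As a configuration sketch (not verified against a specific release; the `gpus` and `num_nodes` values are illustrative placeholders), DDP2 was typically selected through the Trainer in PyTorch Lightning 1.x:

```python
# Sketch, assuming the PyTorch Lightning 1.x Trainer API:
# select DDP2 by name, running DP across 8 GPUs per node and
# DDP-style synchronization across 2 nodes.
from pytorch_lightning import Trainer

trainer = Trainer(accelerator="ddp2", gpus=8, num_nodes=2)

# The plugin can also be passed explicitly, e.g. to set
# constructor arguments such as sync_batchnorm:
# from pytorch_lightning.plugins import DDP2Plugin
# trainer = Trainer(gpus=8, num_nodes=2, plugins=[DDP2Plugin(sync_batchnorm=True)])
```

Launching this requires a multi-GPU, multi-node environment; on a single machine it is configuration only.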
reduce(tensor, *args, **kwargs)

Reduces a tensor from all processes to one aggregated tensor. In DDP2, the reduction is only across the local devices within the node.
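To make the "local devices only" semantics concrete, here is a hypothetical pure-Python illustration (not the library's code): each node aggregates the values held by its own devices, and no cross-node reduction happens in this step. The `node_reduce` helper and the example values are invented for illustration.

```python
# Hypothetical sketch of DDP2's reduce semantics: aggregation
# happens per node, across that node's local devices only.

def node_reduce(per_device_values, reduce_op="mean"):
    """Aggregate values from the local devices of a single node."""
    if reduce_op == "mean":
        return sum(per_device_values) / len(per_device_values)
    if reduce_op == "sum":
        return sum(per_device_values)
    raise ValueError(f"unsupported op: {reduce_op}")

# Two nodes, each with two local devices holding a scalar "loss":
nodes = [[1.0, 3.0], [5.0, 7.0]]

# Each node ends up with one aggregated value; the two nodes'
# results stay separate, unlike a global (DDP-style) all-reduce.
reduced = [node_reduce(devices) for devices in nodes]
print(reduced)  # [2.0, 6.0]
```

Cross-node agreement is instead achieved by the DDP-style gradient synchronization that DDP2 inherits.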
property root_device

Returns the root device.