
TorchSyncBatchNorm

class lightning.pytorch.plugins.TorchSyncBatchNorm[source]

Bases: lightning.pytorch.plugins.layer_sync.LayerSync

A plugin that wraps all batch normalization layers of a model with synchronization logic for multiprocessing.

This plugin has no effect in single-device operation.
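
The usual way to activate this plugin is through the Trainer's sync_batchnorm flag rather than by instantiating it directly. The sketch below assumes a machine with two GPUs; LitConvNet and the random dataset are toy placeholders written only for this example, not part of the Lightning API.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import lightning.pytorch as pl


class LitConvNet(pl.LightningModule):
    """Toy module with a BatchNorm layer, defined only to illustrate the plugin."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

    def training_step(self, batch, batch_idx):
        (x,) = batch
        return self.net(x).mean()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


data = DataLoader(TensorDataset(torch.randn(16, 3, 16, 16)), batch_size=4)

# `sync_batchnorm=True` asks the Trainer to apply this plugin, so every BatchNorm
# layer computes its statistics across all DDP processes instead of per device.
trainer = pl.Trainer(accelerator="gpu", devices=2, strategy="ddp", sync_batchnorm=True)
trainer.fit(LitConvNet(), data)
```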

apply(model)[source]

Add global batchnorm for a model spread across multiple GPUs and nodes.

Override this method to synchronize batchnorm layers between specific process groups instead of the whole world.

Parameters

model (Module) – Reference to the current LightningModule

Return type

Module

Returns

LightningModule with batchnorm layers synchronized within the process groups.
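
As suggested above, apply() can be overridden to restrict synchronization to a subset of ranks. A minimal sketch of one way to do this: the GroupedSyncBatchNorm subclass and its process_group handling are hypothetical, but torch.nn.SyncBatchNorm.convert_sync_batchnorm does accept an optional process group (e.g. one created with torch.distributed.new_group).

```python
import torch
from lightning.pytorch.plugins import TorchSyncBatchNorm


class GroupedSyncBatchNorm(TorchSyncBatchNorm):
    """Hypothetical subclass that syncs batchnorm only within a given process group."""

    def __init__(self, process_group=None):
        super().__init__()
        # Assumed to come from `torch.distributed.new_group(ranks=...)`; `None`
        # falls back to synchronizing across the whole world.
        self.process_group = process_group

    def apply(self, model: torch.nn.Module) -> torch.nn.Module:
        # Passing a process group restricts statistic synchronization to the
        # ranks in that group instead of all processes.
        return torch.nn.SyncBatchNorm.convert_sync_batchnorm(model, self.process_group)
```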

revert(model)[source]

Convert the wrapped batchnorm layers back to regular batchnorm layers.

Parameters

model (Module) – Reference to the current LightningModule

Return type

Module

Returns

LightningModule with regular batchnorm layers that will no longer sync across processes.
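
The apply()/revert() round trip can also be exercised by hand, which is a quick way to see what the plugin does to a model. The sketch below uses a plain torch model for brevity and launches no distributed processes, so the wrapped layers never actually synchronize; it only shows the type changes.

```python
from torch import nn
from lightning.pytorch.plugins import TorchSyncBatchNorm

model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
plugin = TorchSyncBatchNorm()

# Wrap: each batchnorm layer is replaced with a synchronized variant.
synced = plugin.apply(model)
print(isinstance(synced[1], nn.SyncBatchNorm))    # True

# Unwrap: the layers become regular batchnorm again and no longer sync.
reverted = plugin.revert(synced)
print(isinstance(reverted[1], nn.SyncBatchNorm))  # False
```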