MixedPrecisionPlugin

class lightning.pytorch.plugins.precision.MixedPrecisionPlugin(precision, device, scaler=None)[source]

Bases: lightning.pytorch.plugins.precision.precision_plugin.PrecisionPlugin

Plugin for Automatic Mixed Precision (AMP) training with torch.autocast.

Parameters
  • precision (Literal['16-mixed', 'bf16-mixed']) – Whether to use torch.float16 ('16-mixed') or torch.bfloat16 ('bf16-mixed')

  • device (str) – The device for torch.autocast

  • scaler (Optional[GradScaler]) – An optional torch.cuda.amp.GradScaler to use
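
A minimal construction sketch (assuming the Lightning 2.x precision strings "16-mixed"/"bf16-mixed" and an available CUDA device; in practice Trainer(precision="16-mixed") builds this plugin for you):

    from lightning.pytorch import Trainer
    from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

    # Build the AMP plugin explicitly; for "16-mixed" a GradScaler is typically
    # created internally when none is passed.
    amp_plugin = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
    trainer = Trainer(accelerator="gpu", devices=1, plugins=[amp_plugin])
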
clip_gradients(optimizer, clip_val=0.0, gradient_clip_algorithm=GradClipAlgorithmType.NORM)[source]

Clips the gradients.

Return type

None
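
Gradient clipping is normally configured on the Trainer, which forwards the setting to the active precision plugin's clip_gradients() hook. A minimal sketch, assuming Lightning 2.x precision strings:

    from lightning.pytorch import Trainer

    # Clip gradient norms to 1.0 during AMP training; the Trainer routes this
    # setting through the precision plugin's clip_gradients() hook.
    trainer = Trainer(precision="16-mixed", gradient_clip_val=1.0,
                      gradient_clip_algorithm="norm")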

forward_context()[source]

Enable autocast context.

Return type

Generator[None, None, None]
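
A sketch of using the context manually for a forward pass (the plugin, model, and tensor here are illustrative; during training the loop enters this context around your LightningModule's forward automatically):

    import torch
    from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

    plugin = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
    model = torch.nn.Linear(8, 2).cuda()
    x = torch.randn(4, 8, device="cuda")

    # The forward pass runs under torch.autocast on the plugin's device.
    with plugin.forward_context():
        y = model(x)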

load_state_dict(state_dict)[source]

Called when loading a checkpoint; implement to reload precision plugin state given the precision plugin state_dict.

Parameters

state_dict (Dict[str, Any]) – the precision plugin state returned by state_dict.

Return type

None

optimizer_step(optimizer, model, closure, **kwargs)[source]

Hook to run the optimizer step.

Return type

Any
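
For this plugin the step typically runs the closure and then steps the optimizer through the GradScaler. A hedged subclass sketch (the class name is hypothetical) showing where custom logic could wrap the default behaviour:

    from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

    class VerboseMixedPrecision(MixedPrecisionPlugin):
        # Hypothetical subclass for illustration only.
        def optimizer_step(self, optimizer, model, closure, **kwargs):
            # Delegate to the default AMP behaviour, then report the loss scale.
            out = super().optimizer_step(optimizer, model, closure, **kwargs)
            if self.scaler is not None:
                print(f"current loss scale: {self.scaler.get_scale()}")
            return out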

pre_backward(tensor, module)[source]

Runs before precision plugin executes backward.

Parameters
  • tensor (Tensor) – The tensor that will be used for backpropagation

  • module (LightningModule) – The module that was involved in producing the tensor and whose parameters need the gradients

Return type

Tensor
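
When a GradScaler is in use, the loss tensor is typically scaled here before backward. A short override sketch (hypothetical subclass) that inspects the tensor and then defers to the default behaviour:

    from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

    class InspectingMixedPrecision(MixedPrecisionPlugin):
        # Hypothetical subclass for illustration only.
        def pre_backward(self, tensor, module):
            print(f"loss before scaling: {tensor.detach().float().item():.4f}")
            return super().pre_backward(tensor, module)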

state_dict()[source]

Called when saving a checkpoint; implement to generate the precision plugin state_dict.

Return type

Dict[str, Any]

Returns

A dictionary containing precision plugin state.
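
A round-trip sketch for state_dict()/load_state_dict(); for this plugin the state is essentially the GradScaler state (an empty dict when no scaler is used), and both instances here are assumed to use the same precision setting:

    from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

    src = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
    dst = MixedPrecisionPlugin(precision="16-mixed", device="cuda")

    # Save the scaler state from one instance and restore it on another.
    dst.load_state_dict(src.state_dict())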