MixedPrecisionPlugin
- class lightning.pytorch.plugins.precision.MixedPrecisionPlugin(precision, device, scaler=None)
  Bases: lightning.pytorch.plugins.precision.precision_plugin.PrecisionPlugin
  Plugin for Automatic Mixed Precision (AMP) training with torch.autocast.
  - Parameters
    - precision (Literal['16-mixed', 'bf16-mixed']) – Whether to use torch.float16 ('16-mixed') or torch.bfloat16 ('bf16-mixed').
    - device (str) – The device for torch.autocast.
    - scaler (Optional[GradScaler]) – An optional torch.cuda.amp.GradScaler to use.
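A minimal usage sketch (not part of the reference entry above): the plugin can be constructed explicitly and handed to the Trainer via its plugins argument, which is equivalent to the Trainer(precision="16-mixed") shorthand.

```python
from lightning.pytorch import Trainer
from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

# Construct the plugin explicitly; Trainer(precision="16-mixed") is the usual shorthand.
precision_plugin = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
trainer = Trainer(accelerator="gpu", devices=1, plugins=[precision_plugin])
```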
- clip_gradients(optimizer, clip_val=0.0, gradient_clip_algorithm=GradClipAlgorithmType.NORM)
  Clips the gradients.
  - Return type
    None
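The Trainer normally invokes this hook itself when gradient_clip_val is set; the following standalone sketch, assuming a toy CPU model and a plain SGD optimizer, shows the call directly.

```python
import torch
from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

plugin = MixedPrecisionPlugin(precision="bf16-mixed", device="cpu")

# Toy model/optimizer pair; in real use the Trainer drives this hook.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model(torch.randn(8, 4)).sum().backward()

plugin.clip_gradients(optimizer, clip_val=1.0)  # norm-based clipping by default
```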
- load_state_dict(state_dict)
  Called when loading a checkpoint; implement to reload precision plugin state given the precision plugin state_dict.
  - Parameters
    state_dict (Dict[str, Any]) – the precision plugin state returned by state_dict.
  - Return type
    None
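A sketch of the checkpoint round-trip, assuming '16-mixed' precision so the plugin owns a GradScaler (with 'bf16-mixed' there is no scaler state to save):

```python
from lightning.pytorch.plugins.precision import MixedPrecisionPlugin

plugin = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
state = plugin.state_dict()  # captures the torch.cuda.amp.GradScaler state

restored = MixedPrecisionPlugin(precision="16-mixed", device="cuda")
restored.load_state_dict(state)  # reloads the scaler state from the checkpoint
```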