CheckpointHooks

class lightning.pytorch.core.hooks.CheckpointHooks[source]

Bases: object

Hooks to be used with Checkpointing.

on_load_checkpoint(checkpoint)[source]

Called by Lightning to restore your model. If you saved something with on_save_checkpoint(), this is your chance to restore it.

Parameters

checkpoint (Dict[str, Any]) – Loaded checkpoint

Example:

def on_load_checkpoint(self, checkpoint):
    # 99% of the time you don't need to implement this method
    self.something_cool_i_want_to_save = checkpoint['something_cool_i_want_to_save']

Note

Lightning auto-restores the global step, epoch, and training state, including AMP scaling. There is no need for you to restore anything regarding training.

Return type

None

on_save_checkpoint(checkpoint)[source]

Called by Lightning when saving a checkpoint to give you a chance to store anything else you might want to save.

Parameters

checkpoint (Dict[str, Any]) – The full checkpoint dictionary before it gets dumped to a file. Implementations of this hook can insert additional data into this dictionary.

Example:

def on_save_checkpoint(self, checkpoint):
    # in 99% of use cases you don't need to implement this method
    checkpoint['something_cool_i_want_to_save'] = my_cool_picklable_object

Note

Lightning saves all aspects of training (epoch, global step, etc.), including AMP scaling. There is no need for you to store anything about training.

Return type

None
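Taken together, the two hooks form a round trip over the checkpoint dictionary: on_save_checkpoint() inserts extra entries before the dict is dumped to disk, and on_load_checkpoint() reads them back on restore. The sketch below imitates that contract in plain Python, without Lightning itself, so the flow can be seen in isolation; the class name and the simulated checkpoint keys are illustrative, not part of the Lightning API:

```python
from typing import Any, Dict


class MyModel:
    """Illustrative stand-in for a LightningModule carrying extra state."""

    def __init__(self):
        self.something_cool_i_want_to_save = {"temperature": 0.7}

    def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        # Insert extra (picklable) data before the dict is dumped to a file.
        checkpoint["something_cool_i_want_to_save"] = self.something_cool_i_want_to_save

    def on_load_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        # Restore the extra data that on_save_checkpoint() inserted.
        self.something_cool_i_want_to_save = checkpoint["something_cool_i_want_to_save"]


# Simulate what Lightning does around the hooks: it builds the checkpoint
# dict (with training state it manages itself), lets the model add to it,
# and later hands the loaded dict back to the model.
model = MyModel()
checkpoint: Dict[str, Any] = {"epoch": 3, "global_step": 1200}  # Lightning-managed keys
model.on_save_checkpoint(checkpoint)

restored = MyModel()
restored.something_cool_i_want_to_save = None
restored.on_load_checkpoint(checkpoint)
print(restored.something_cool_i_want_to_save)  # → {'temperature': 0.7}
```

Note that the extra value must be picklable, since the whole dictionary is serialized to a file; the training-state keys ("epoch", "global_step") are shown only to emphasize that Lightning manages them for you.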