torch.optim.Optimizer.register_state_dict_pre_hook
- Optimizer.register_state_dict_pre_hook(hook, prepend=False)[source]
- Register a state dict pre-hook which will be called before `state_dict()` is called. It should have the following signature:

      hook(optimizer) -> None

  The `optimizer` argument is the optimizer instance being used. The hook will be called with argument `self` before calling `state_dict` on `self`. The registered hook can be used to perform pre-processing before the `state_dict` call is made.
- Parameters
- hook (Callable) – The user defined hook to be registered.
- prepend (bool) – If True, the provided pre-hook `hook` will be fired before all the already registered pre-hooks on `state_dict`. Otherwise, the provided `hook` will be fired after all the already registered pre-hooks. (default: False)
 
- Returns
- a handle that can be used to remove the added hook by calling `handle.remove()`
- Return type
- torch.utils.hooks.RemovableHandle