torch.optim.Optimizer.register_state_dict_pre_hook#
- Optimizer.register_state_dict_pre_hook(hook, prepend=False)[source]#
 Register a state dict pre-hook which will be called before state_dict() is called. It should have the following signature:
hook(optimizer) -> None
The optimizer argument is the optimizer instance being used. The hook will be called with argument self before calling state_dict on self. The registered hook can be used to perform pre-processing before the state_dict call is made.
- Parameters
 - hook (Callable) – The user defined hook to be registered.
 - prepend (bool) – If True, the provided pre-hook will be fired before all the already registered pre-hooks on state_dict. Otherwise, the provided hook will be fired after all the already registered pre-hooks. (default: False)
- Returns
 A handle that can be used to remove the added hook by calling handle.remove().
- Return type
 torch.utils.hooks.RemovableHandle