torch.optim.Optimizer.register_load_state_dict_pre_hook¶
- Optimizer.register_load_state_dict_pre_hook(hook, prepend=False)[source]¶
- Register a load_state_dict pre-hook which will be called before load_state_dict() is called. It should have the following signature:

    hook(optimizer, state_dict) -> state_dict or None

  The optimizer argument is the optimizer instance being used and the state_dict argument is a shallow copy of the state_dict the user passed in to load_state_dict. The hook may modify the state_dict in place or optionally return a new one. If a state_dict is returned, it will be loaded into the optimizer instead.

  The hook will be called with arguments self and state_dict before calling load_state_dict on self. The registered hook can be used to perform pre-processing before the load_state_dict call is made.
- Parameters
- hook (Callable) – The user defined hook to be registered. 
- prepend (bool) – If True, the provided pre-hook will be fired before all the already registered pre-hooks on load_state_dict. Otherwise, the provided hook will be fired after all the already registered pre-hooks. (default: False)
 
- Returns
- a handle that can be used to remove the added hook by calling - handle.remove()
- Return type
- torch.utils.hooks.RemovableHandle