validate_missing_and_unexpected_for_lora

torchtune.modules.peft.validate_missing_and_unexpected_for_lora(lora_attn_modules: Optional[list[Literal['q_proj', 'k_proj', 'v_proj', 'output_proj']]] = None, apply_lora_to_mlp: Optional[bool] = None, apply_lora_to_output: Optional[bool] = None, state_dict_keys: Optional[list[str]] = None, base_missing: Optional[list[str]] = None, base_unexpected: Optional[list[str]] = None, lora_missing: Optional[list[str]] = None, lora_unexpected: Optional[list[str]] = None) None[source]

This function checks that LoRA and/or base model weights are loaded into the full model correctly, via set comparison of the missing keys. It relies only on the values of missing and unexpected as returned by the load_state_dict API with strict=False.

Parameters:
  • lora_attn_modules (Optional[list[LORA_ATTN_MODULES]]) – list of which linear layers LoRA should be applied to in each self-attention block. Options are {"q_proj", "k_proj", "v_proj", "output_proj"}. DEPRECATED: use state_dict_keys instead.

  • apply_lora_to_mlp (Optional[bool]) – whether LoRA is applied to each MLP linear. DEPRECATED: use state_dict_keys instead.

  • apply_lora_to_output (Optional[bool]) – whether LoRA is applied to the final output projection. DEPRECATED: use state_dict_keys instead.

  • state_dict_keys (Optional[list[str]]) – list of keys from the ground-truth model state dict that we are validating against. Default: None

  • base_missing (Optional[list[str]]) – list of missing keys when loading base model weights. Default: None

  • base_unexpected (Optional[list[str]]) – list of unexpected keys when loading base model weights. Default: None

  • lora_missing (Optional[list[str]]) – list of missing keys when loading LoRA weights. Default: None

  • lora_unexpected (Optional[list[str]]) – list of unexpected keys when loading LoRA weights. Default: None

Returns:

None

Raises:

RuntimeError – If base_missing contains any base model keys, or if base_unexpected is nonempty, or if lora_missing contains any LoRA keys, or if lora_unexpected is nonempty.
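Example

A minimal usage sketch. It assumes model is a LoRA-augmented torchtune module and that base_sd and lora_sd are base-model and adapter state dicts already loaded from checkpoints; these names are illustrative, not part of the API.

    from torchtune.modules.peft import validate_missing_and_unexpected_for_lora

    # Load base model weights with strict=False; the LoRA adapter params are
    # expected to show up in the missing keys here.
    base_missing, base_unexpected = model.load_state_dict(base_sd, strict=False)

    # Load LoRA adapter weights with strict=False; the base model params are
    # expected to show up in the missing keys here.
    lora_missing, lora_unexpected = model.load_state_dict(lora_sd, strict=False)

    # Validate that nothing was silently dropped or misplaced. Raises RuntimeError
    # if base model keys are missing after the base load, LoRA keys are missing
    # after the adapter load, or either load produced unexpected keys.
    validate_missing_and_unexpected_for_lora(
        state_dict_keys=list(model.state_dict().keys()),
        base_missing=base_missing,
        base_unexpected=base_unexpected,
        lora_missing=lora_missing,
        lora_unexpected=lora_unexpected,
    )

Note that state_dict_keys is the non-deprecated way to tell the validator which keys belong to the LoRA adapter versus the base model, replacing lora_attn_modules, apply_lora_to_mlp, and apply_lora_to_output.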
