torch.nn.attention.activate_flash_attention_impl

torch.nn.attention.activate_flash_attention_impl(impl)

Activate a previously registered flash attention implementation in the dispatcher.

Note

Backend providers should NOT automatically activate their implementation on import. Users should explicitly opt-in by calling this function or via environment variables to ensure multiple provider libraries can coexist.

Parameters:

impl (str | Literal['FA4']) – Implementation identifier to activate. See list_flash_attention_impls() for available implementations. If the backend’s register_flash_attention_impl() callable returns a FlashAttentionHandle, the registry keeps that handle alive for the lifetime of the process (until explicit uninstall support exists).

Example

>>> activate_flash_attention_impl("FA4")