:github_url: https://github.com/pytorch/pytorch

.. _exhale_function_namespaceat_1a2dfd1331b7eb2327f9a4af034d21c7a0:

Function at::_scaled_dot_product_flash_attention_for_cpu
========================================================

- Defined in :ref:`file_build_aten_src_ATen_Functions.h`

Function Documentation
----------------------

.. doxygenfunction:: at::_scaled_dot_product_flash_attention_for_cpu(const at::Tensor&, const at::Tensor&, const at::Tensor&, double, bool, const ::std::optional<at::Tensor>&, ::std::optional<double>)
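
Below is a minimal usage sketch, not an official example from this page. It assumes a ``(batch, num_heads, seq_len, head_dim)`` layout for the query/key/value tensors, that the trailing parameters are dropout probability, causal flag, optional attention mask, and optional softmax scale, and that the call returns a tuple whose first element is the attention output; only the function name and parameter types above come directly from this page.

.. code-block:: cpp

   #include <ATen/ATen.h>
   #include <optional>
   #include <tuple>

   int main() {
     // Assumed layout: (batch, num_heads, seq_len, head_dim).
     at::Tensor query = at::randn({2, 4, 16, 64});
     at::Tensor key   = at::randn({2, 4, 16, 64});
     at::Tensor value = at::randn({2, 4, 16, 64});

     // No dropout, non-causal, no attention mask, default softmax scale.
     auto result = at::_scaled_dot_product_flash_attention_for_cpu(
         query, key, value,
         /*dropout_p=*/0.0, /*is_causal=*/false,
         /*attn_mask=*/::std::nullopt, /*scale=*/::std::nullopt);

     // Assumption: the first tuple element is the attention output,
     // with the same shape as the query tensor.
     at::Tensor output = ::std::get<0>(result);
     return 0;
   }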