:github_url: https://github.com/pytorch/pytorch

.. _exhale_function_namespaceat_1_1symint_1af248686381153eef83bd8cb77de47331:

Template Function at::symint::_flash_attention_backward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, int64_t, int64_t, double, bool, const at::Tensor&, const at::Tensor&, ::std::optional, ::std::optional, ::std::optional)
==========================================================================================================================================================================================================================================================================================================================================================

- Defined in :ref:`file_build_aten_src_ATen_Functions.h`

Function Documentation
----------------------

.. doxygenfunction:: at::symint::_flash_attention_backward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, int64_t, int64_t, double, bool, const at::Tensor&, const at::Tensor&, ::std::optional, ::std::optional, ::std::optional)
   :project: PyTorch
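
The sketch below illustrates how this overload might be invoked from C++. The parameter names (``grad_out``, ``query``, ``cum_seq_q``, ``philox_seed``, and so on), the ``int64_t`` template argument, and the tuple return type are assumptions inferred from the signature above, not details documented on this page; only the argument order and types follow the declaration.

.. code-block:: cpp

   // Hypothetical usage sketch. Parameter names and the return type are
   // assumptions; the argument order mirrors the signature documented above.
   #include <ATen/ATen.h>

   void flash_attention_backward_example(
       const at::Tensor& grad_out,      // gradient w.r.t. the attention output (assumed)
       const at::Tensor& query,
       const at::Tensor& key,
       const at::Tensor& value,
       const at::Tensor& out,           // forward output (assumed)
       const at::Tensor& logsumexp,     // forward softmax statistics (assumed)
       const at::Tensor& cum_seq_q,     // cumulative sequence lengths (assumed)
       const at::Tensor& cum_seq_k,
       int64_t max_q,
       int64_t max_k,
       double dropout_p,
       bool is_causal,
       const at::Tensor& philox_seed,   // RNG state tensors (assumed)
       const at::Tensor& philox_offset) {
     // The three trailing ::std::optional arguments are left empty here, since
     // their element types are not spelled out on this page.
     auto grads = at::symint::_flash_attention_backward<int64_t>(
         grad_out, query, key, value, out, logsumexp,
         cum_seq_q, cum_seq_k, max_q, max_k,
         dropout_p, is_causal, philox_seed, philox_offset,
         ::std::nullopt, ::std::nullopt, ::std::nullopt);
     (void)grads;  // assumed to be a tuple of gradients for query, key, and value
   }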