:github_url: https://github.com/pytorch/pytorch

.. _exhale_function_namespaceat_1aec3f0f65144e63182979d6ba5fc820f7:

Function at::_scaled_dot_product_fused_attention_overrideable_backward_symint
==============================================================================

- Defined in :ref:`file_build_aten_src_ATen_Functions.h`

Function Documentation
----------------------

.. doxygenfunction:: at::_scaled_dot_product_fused_attention_overrideable_backward_symint(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, ::std::array<bool, 4>, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, c10::SymInt, c10::SymInt, double, bool, const at::Tensor&, const at::Tensor&, ::std::optional<double>)
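
The signature above is the C++ dispatcher entry point for the backward pass of the overrideable fused scaled-dot-product-attention path. The sketch below is illustrative only and is not part of the generated documentation: the tensor shapes and variable names are hypothetical, and it assumes a backend that actually implements the overrideable fused-attention kernels is registered; on a build without one, the dispatcher is expected to report the op as not implemented for the input device.

.. code-block:: cpp

   // Minimal sketch of the call shape (assumption: a backend implementing the
   // overrideable fused-attention kernels handles the dispatch).
   #include <ATen/ATen.h>
   #include <tuple>

   int main() {
     // Hypothetical shapes: batch=2, heads=4, seq_len=8, head_dim=16.
     at::Tensor query    = at::randn({2, 4, 8, 16});
     at::Tensor key      = at::randn({2, 4, 8, 16});
     at::Tensor value    = at::randn({2, 4, 8, 16});
     at::Tensor attn_bias;  // left undefined: no additive bias
     at::Tensor grad_out = at::randn({2, 4, 8, 16});

     // In real use these come from the corresponding forward call;
     // they are placeholders here.
     at::Tensor out, logsumexp, cum_seq_q, cum_seq_k, philox_seed, philox_offset;

     // Request gradients for query/key/value but not for attn_bias.
     ::std::array<bool, 4> grad_input_mask = {true, true, true, false};

     auto grads = at::_scaled_dot_product_fused_attention_overrideable_backward_symint(
         grad_out, query, key, value, attn_bias, grad_input_mask,
         out, logsumexp, cum_seq_q, cum_seq_k,
         /*max_q=*/c10::SymInt(8), /*max_k=*/c10::SymInt(8),
         /*dropout_p=*/0.0, /*is_causal=*/false,
         philox_seed, philox_offset,
         /*scale=*/::std::nullopt);

     at::Tensor grad_query = std::get<0>(grads);
     at::Tensor grad_key   = std::get<1>(grads);
     at::Tensor grad_value = std::get<2>(grads);
     (void)grad_query; (void)grad_key; (void)grad_value;
     return 0;
   }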