:github_url: https://github.com/pytorch/pytorch

.. _exhale_function_namespaceat_1_1symint_1ae329e6e9ecacb635432e6d81ddc419ee:

Template Function at::symint::_scaled_dot_product_cudnn_attention_backward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, c10::SymInt, c10::SymInt, double, bool, ::std::optional)
============================================================================================================================================================================================================================================================================================================================================================

- Defined in :ref:`file_build_aten_src_ATen_Functions.h`

Function Documentation
----------------------

.. doxygenfunction:: at::symint::_scaled_dot_product_cudnn_attention_backward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, c10::SymInt, c10::SymInt, double, bool, ::std::optional)