AttentionBackend#

class torchao.prototype.attention.AttentionBackend(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Enum selecting the backend kernel used to compute attention.
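The constructor signature above is the standard `enum.Enum` functional API, so `AttentionBackend` is used like any Python enum: pass a member to the API that accepts it to select a kernel. The sketch below illustrates the pattern with a locally defined stand-in; the member names `MATH` and `FLASH` and the `run_attention` dispatcher are hypothetical, not part of torchao.

```python
from enum import Enum, auto


# Hypothetical stand-in mirroring how an attention-backend enum is typically
# consumed; real member names are defined by torchao.prototype.attention.
class AttentionBackend(Enum):
    MATH = auto()   # hypothetical: reference/eager implementation
    FLASH = auto()  # hypothetical: fused flash-attention kernel


def run_attention(backend: AttentionBackend) -> str:
    # Placeholder dispatch: pick a kernel based on the enum member.
    if backend is AttentionBackend.FLASH:
        return "flash kernel"
    return "math fallback"


print(run_attention(AttentionBackend.FLASH))
```

Because enum members are singletons, identity comparison (`is`) is the idiomatic way to dispatch on them.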