
torch.nn.functional.prelu

torch.nn.functional.prelu(input, weight) → Tensor

Applies element-wise the function $\text{PReLU}(x) = \max(0, x) + \text{weight} * \min(0, x)$ where weight is a learnable parameter.
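A minimal usage sketch (not part of the original page) with a scalar weight, verifying the output against the element-wise formula above:

```python
import torch
import torch.nn.functional as F

# Scalar weight: negative inputs are scaled by the weight,
# positive inputs pass through unchanged.
x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])
w = torch.tensor(0.25)
out = F.prelu(x, w)

# Equivalent element-wise computation: max(0, x) + weight * min(0, x)
expected = torch.clamp(x, min=0) + w * torch.clamp(x, max=0)
print(out)                            # tensor([-0.5000, -0.1250,  0.0000,  1.0000,  3.0000])
print(torch.allclose(out, expected))  # True
```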

Note

weight is expected to be a scalar or 1-D tensor. If weight is 1-D, its size must match the number of input channels, which is input.size(1) when input.dim() >= 2 and 1 otherwise. In the 1-D case, note that when input has more than two dimensions, weight is expanded to the shape of input in a way that is not possible under normal broadcasting semantics: it is broadcast along dimension 1 (the channel dimension) rather than the trailing dimensions. A sketch of this behavior follows below.
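A minimal sketch (assuming a 4-D NCHW input, not part of the original page) showing the 1-D weight with one entry per channel and the equivalent manual broadcast along the channel dimension:

```python
import torch
import torch.nn.functional as F

# 1-D weight with one slope per channel (input.size(1) == 3).
x = torch.randn(2, 3, 4, 4)          # (N, C, H, W)
w = torch.tensor([0.1, 0.2, 0.3])    # one negative-slope per channel
out = F.prelu(x, w)

# Same result via explicit broadcasting of weight over the channel dimension.
w_expanded = w.view(1, 3, 1, 1)
expected = torch.clamp(x, min=0) + w_expanded * torch.clamp(x, max=0)
print(torch.allclose(out, expected))  # True
```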

See torch.nn.PReLU for more details.