# celu

`class torch.ao.nn.quantized.functional.celu(input, scale, zero_point, alpha=1.)` [source]

Applies the quantized CELU function element-wise:

$$\text{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x / \alpha) - 1))$$

**Parameters**

- **input** (*Tensor*) – quantized input
- **scale** (*float*) – quantization scale of the output tensor
- **zero_point** (*int*) – quantization zero point of the output tensor
- **alpha** (*float*) – the $\alpha$ value for the CELU formulation. Default: 1.0

**Return type:** Tensor