torch.nn.functional.softplus#

torch.nn.functional.softplus(input, beta=1, threshold=20) → Tensor

Applies element-wise the function \(\text{Softplus}(x) = \frac{1}{\beta} \log(1 + \exp(\beta x))\).

For numerical stability the implementation reverts to the linear function when \(\text{input} \times \beta > \text{threshold}\).

See Softplus for more details.
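A short sketch of typical usage, including the linear regime described above (values here are illustrative; `softplus(0)` equals `log(2)` by the formula, and inputs beyond the threshold are passed through unchanged):

```python
import math
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 0.0, 2.0])
y = F.softplus(x)  # elementwise (1/beta) * log(1 + exp(beta * x)), beta=1

# softplus(0) = log(2) ≈ 0.6931
assert math.isclose(y[1].item(), math.log(2.0), rel_tol=1e-6)

# With input * beta > threshold (default 20), the output is the input itself,
# avoiding overflow in exp(beta * x):
large = torch.tensor([50.0])
assert F.softplus(large).item() == 50.0
```

Raising `beta` sharpens the transition toward ReLU, while the `threshold` argument only controls where the numerically stable linear approximation takes over.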