torch.nn.functional.mish

torch.nn.functional.mish(input, inplace=False)[source]

Apply the Mish function, element-wise. Mish is a self-regularized non-monotonic neural activation function:

\text{Mish}(x) = x * \text{Tanh}(\text{Softplus}(x))

Note

See Mish: A Self Regularized Non-Monotonic Neural Activation Function.

See Mish for more details.

Return type
    Tensor
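The formula above can be verified numerically by comparing the built-in `torch.nn.functional.mish` against an explicit composition of `tanh` and `softplus`; a minimal sketch:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8)

# Built-in element-wise Mish activation.
out = F.mish(x)

# Manual composition per the definition: Mish(x) = x * tanh(softplus(x)).
manual = x * torch.tanh(F.softplus(x))

# The two computations agree to floating-point tolerance.
print(torch.allclose(out, manual, atol=1e-6))
```

With `inplace=True`, the input tensor is overwritten with the result, which saves memory but should be avoided on tensors needed elsewhere in the autograd graph.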