Activation Functions#
Activation functions introduce non-linearity into neural networks, allowing them to learn complex patterns. Without activations, stacked linear layers would collapse into a single linear transformation.
Common choices:
- ReLU family (ReLU, LeakyReLU, PReLU, RReLU): fast, widely used, a good default choice
- ELU family (ELU, SELU, CELU): smoother than ReLU, can produce negative outputs
- GELU/SiLU/Mish: modern activations popular in transformers and advanced architectures
- Sigmoid/Tanh: classic activations, useful for output layers (probabilities, bounded outputs)
- Softmax: converts logits to a probability distribution (classification output)
ReLU#
-
class ReLU : public torch::nn::ModuleHolder<ReLUImpl>#
A ModuleHolder subclass for ReLUImpl.
See the documentation for the ReLUImpl class to learn what methods it provides, and examples of how to use ReLU with torch::nn::ReLUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class ReLUImpl : public torch::nn::Cloneable<ReLUImpl>#
Applies the ReLU function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.ReLU to learn about the exact behavior of this module.
See the documentation for the torch::nn::ReLUOptions class to learn what constructor arguments are supported for this module.
Example:
ReLU model(ReLUOptions().inplace(true));
Example:
auto relu = torch::nn::ReLU(torch::nn::ReLUOptions().inplace(true));
LeakyReLU#
-
class LeakyReLU : public torch::nn::ModuleHolder<LeakyReLUImpl>#
A ModuleHolder subclass for LeakyReLUImpl.
See the documentation for the LeakyReLUImpl class to learn what methods it provides, and examples of how to use LeakyReLU with torch::nn::LeakyReLUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = LeakyReLUImpl#
-
class LeakyReLUImpl : public torch::nn::Cloneable<LeakyReLUImpl>#
Applies the LeakyReLU function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.LeakyReLU to learn about the exact behavior of this module.
See the documentation for the torch::nn::LeakyReLUOptions class to learn what constructor arguments are supported for this module.
Example:
LeakyReLU model(LeakyReLUOptions().negative_slope(0.42).inplace(true));
PReLU#
-
class PReLU : public torch::nn::ModuleHolder<PReLUImpl>#
A ModuleHolder subclass for PReLUImpl.
See the documentation for the PReLUImpl class to learn what methods it provides, and examples of how to use PReLU with torch::nn::PReLUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class PReLUImpl : public torch::nn::Cloneable<PReLUImpl>#
Applies the PReLU function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.PReLU to learn about the exact behavior of this module.
See the documentation for the torch::nn::PReLUOptions class to learn what constructor arguments are supported for this module.
Example:
PReLU model(PReLUOptions().num_parameters(42));
RReLU#
-
class RReLU : public torch::nn::ModuleHolder<RReLUImpl>#
A ModuleHolder subclass for RReLUImpl.
See the documentation for the RReLUImpl class to learn what methods it provides, and examples of how to use RReLU with torch::nn::RReLUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class RReLUImpl : public torch::nn::Cloneable<RReLUImpl>#
Applies the RReLU function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.RReLU to learn about the exact behavior of this module.
See the documentation for the torch::nn::RReLUOptions class to learn what constructor arguments are supported for this module.
Example:
RReLU model(RReLUOptions().lower(0.24).upper(0.42).inplace(true));
ReLU6#
Like ReLU but caps the output at 6: min(max(0, x), 6). Commonly used in
mobile architectures (MobileNet).
-
class ReLU6 : public torch::nn::ModuleHolder<ReLU6Impl>#
A ModuleHolder subclass for ReLU6Impl.
See the documentation for the ReLU6Impl class to learn what methods it provides, and examples of how to use ReLU6 with torch::nn::ReLU6Options. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class ReLU6Impl : public torch::nn::Cloneable<ReLU6Impl>#
Applies the ReLU6 function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.ReLU6 to learn about the exact behavior of this module.
See the documentation for the torch::nn::ReLU6Options class to learn what constructor arguments are supported for this module.
Example:
ReLU6 model(ReLU6Options().inplace(true));
GLU#
Gated Linear Unit. Splits the input tensor in half along a dimension,
then applies a * sigmoid(b).
-
class GLU : public torch::nn::ModuleHolder<GLUImpl>#
A ModuleHolder subclass for GLUImpl.
See the documentation for the GLUImpl class to learn what methods it provides, and examples of how to use GLU with torch::nn::GLUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class GLUImpl : public torch::nn::Cloneable<GLUImpl>#
Applies glu over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.GLU to learn about the exact behavior of this module.
See the documentation for the torch::nn::GLUOptions class to learn what constructor arguments are supported for this module.
Example:
GLU model(GLUOptions(1));
LogSigmoid#
Applies element-wise log(sigmoid(x)). Numerically more stable than
computing log and sigmoid separately.
-
class LogSigmoid : public torch::nn::ModuleHolder<LogSigmoidImpl>#
A ModuleHolder subclass for LogSigmoidImpl.
See the documentation for the LogSigmoidImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = LogSigmoidImpl#
-
class LogSigmoidImpl : public torch::nn::Cloneable<LogSigmoidImpl>#
Applies the LogSigmoid function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.LogSigmoid to learn about the exact behavior of this module.
Public Functions
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the LogSigmoid module into the given stream.
ELU#
-
class ELU : public torch::nn::ModuleHolder<ELUImpl>#
A ModuleHolder subclass for ELUImpl.
See the documentation for the ELUImpl class to learn what methods it provides, and examples of how to use ELU with torch::nn::ELUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class ELUImpl : public torch::nn::Cloneable<ELUImpl>#
Applies elu over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.ELU to learn about the exact behavior of this module.
See the documentation for the torch::nn::ELUOptions class to learn what constructor arguments are supported for this module.
Example:
ELU model(ELUOptions().alpha(42.42).inplace(true));
SELU#
-
class SELU : public torch::nn::ModuleHolder<SELUImpl>#
A ModuleHolder subclass for SELUImpl.
See the documentation for the SELUImpl class to learn what methods it provides, and examples of how to use SELU with torch::nn::SELUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class SELUImpl : public torch::nn::Cloneable<SELUImpl>#
Applies the selu function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.SELU to learn about the exact behavior of this module.
See the documentation for the torch::nn::SELUOptions class to learn what constructor arguments are supported for this module.
Example:
SELU model(SELUOptions().inplace(true));
CELU#
-
class CELU : public torch::nn::ModuleHolder<CELUImpl>#
A ModuleHolder subclass for CELUImpl.
See the documentation for the CELUImpl class to learn what methods it provides, and examples of how to use CELU with torch::nn::CELUOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class CELUImpl : public torch::nn::Cloneable<CELUImpl>#
Applies celu over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.CELU to learn about the exact behavior of this module.
See the documentation for the torch::nn::CELUOptions class to learn what constructor arguments are supported for this module.
Example:
CELU model(CELUOptions().alpha(42.42).inplace(true));
GELU#
-
class GELU : public torch::nn::ModuleHolder<GELUImpl>#
A ModuleHolder subclass for GELUImpl.
See the documentation for the GELUImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class GELUImpl : public torch::nn::Cloneable<GELUImpl>#
Applies gelu over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.GELU to learn about the exact behavior of this module.
SiLU (Swish)#
-
class SiLU : public torch::nn::ModuleHolder<SiLUImpl>#
A ModuleHolder subclass for SiLUImpl.
See the documentation for the SiLUImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class SiLUImpl : public torch::nn::Cloneable<SiLUImpl>#
Applies silu over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.SiLU to learn about the exact behavior of this module.
Mish#
-
class Mish : public torch::nn::ModuleHolder<MishImpl>#
A ModuleHolder subclass for MishImpl.
See the documentation for the MishImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class MishImpl : public torch::nn::Cloneable<MishImpl>#
Applies mish over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Mish to learn about the exact behavior of this module.
Sigmoid#
-
class Sigmoid : public torch::nn::ModuleHolder<SigmoidImpl>#
A ModuleHolder subclass for SigmoidImpl.
See the documentation for the SigmoidImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SigmoidImpl#
-
class SigmoidImpl : public torch::nn::Cloneable<SigmoidImpl>#
Applies sigmoid over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Sigmoid to learn about the exact behavior of this module.
Tanh#
-
class Tanh : public torch::nn::ModuleHolder<TanhImpl>#
A ModuleHolder subclass for TanhImpl.
See the documentation for the TanhImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
-
class TanhImpl : public torch::nn::Cloneable<TanhImpl>#
Applies Tanh over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Tanh to learn about the exact behavior of this module.
Softmax#
-
class Softmax : public torch::nn::ModuleHolder<SoftmaxImpl>#
A ModuleHolder subclass for SoftmaxImpl.
See the documentation for the SoftmaxImpl class to learn what methods it provides, and examples of how to use Softmax with torch::nn::SoftmaxOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SoftmaxImpl#
-
class SoftmaxImpl : public torch::nn::Cloneable<SoftmaxImpl>#
Applies the Softmax function.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softmax to learn about the exact behavior of this module.
See the documentation for the torch::nn::SoftmaxOptions class to learn what constructor arguments are supported for this module.
Example:
Softmax model(SoftmaxOptions(1));
Public Functions
-
inline explicit SoftmaxImpl(int64_t dim)#
-
explicit SoftmaxImpl(const SoftmaxOptions &options_)#
Public Members
-
SoftmaxOptions options#
Example:
auto softmax = torch::nn::Softmax(torch::nn::SoftmaxOptions(/*dim=*/1));
Softmax2d#
Applies Softmax over features to each spatial location in a 4D input
tensor of shape (N, C, H, W).
-
class Softmax2d : public torch::nn::ModuleHolder<Softmax2dImpl>#
A ModuleHolder subclass for Softmax2dImpl.
See the documentation for the Softmax2dImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = Softmax2dImpl#
-
class Softmax2dImpl : public torch::nn::Cloneable<Softmax2dImpl>#
Applies the Softmax2d function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softmax2d to learn about the exact behavior of this module.
LogSoftmax#
-
class LogSoftmax : public torch::nn::ModuleHolder<LogSoftmaxImpl>#
A ModuleHolder subclass for LogSoftmaxImpl.
See the documentation for the LogSoftmaxImpl class to learn what methods it provides, and examples of how to use LogSoftmax with torch::nn::LogSoftmaxOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = LogSoftmaxImpl#
-
class LogSoftmaxImpl : public torch::nn::Cloneable<LogSoftmaxImpl>#
Applies the LogSoftmax function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.LogSoftmax to learn about the exact behavior of this module.
See the documentation for the torch::nn::LogSoftmaxOptions class to learn what constructor arguments are supported for this module.
Example:
LogSoftmax model(LogSoftmaxOptions(1));
Public Functions
-
inline explicit LogSoftmaxImpl(int64_t dim)#
-
explicit LogSoftmaxImpl(const LogSoftmaxOptions &options_)#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the LogSoftmax module into the given stream.
Public Members
-
LogSoftmaxOptions options#
Softmin#
-
class Softmin : public torch::nn::ModuleHolder<SoftminImpl>#
A ModuleHolder subclass for SoftminImpl.
See the documentation for the SoftminImpl class to learn what methods it provides, and examples of how to use Softmin with torch::nn::SoftminOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SoftminImpl#
-
class SoftminImpl : public torch::nn::Cloneable<SoftminImpl>#
Applies the Softmin function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softmin to learn about the exact behavior of this module.
See the documentation for the torch::nn::SoftminOptions class to learn what constructor arguments are supported for this module.
Example:
Softmin model(SoftminOptions(1));
Public Functions
-
inline explicit SoftminImpl(int64_t dim)#
-
explicit SoftminImpl(const SoftminOptions &options_)#
Public Members
-
SoftminOptions options#
Softplus#
-
class Softplus : public torch::nn::ModuleHolder<SoftplusImpl>#
A ModuleHolder subclass for SoftplusImpl.
See the documentation for the SoftplusImpl class to learn what methods it provides, and examples of how to use Softplus with torch::nn::SoftplusOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SoftplusImpl#
-
class SoftplusImpl : public torch::nn::Cloneable<SoftplusImpl>#
Applies softplus over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softplus to learn about the exact behavior of this module.
See the documentation for the torch::nn::SoftplusOptions class to learn what constructor arguments are supported for this module.
Example:
Softplus model(SoftplusOptions().beta(0.24).threshold(42.42));
Softshrink#
-
class Softshrink : public torch::nn::ModuleHolder<SoftshrinkImpl>#
A ModuleHolder subclass for SoftshrinkImpl.
See the documentation for the SoftshrinkImpl class to learn what methods it provides, and examples of how to use Softshrink with torch::nn::SoftshrinkOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SoftshrinkImpl#
-
class SoftshrinkImpl : public torch::nn::Cloneable<SoftshrinkImpl>#
Applies the soft shrinkage function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softshrink to learn about the exact behavior of this module.
See the documentation for the torch::nn::SoftshrinkOptions class to learn what constructor arguments are supported for this module.
Example:
Softshrink model(SoftshrinkOptions(42.42));
Public Functions
-
explicit SoftshrinkImpl(const SoftshrinkOptions &options_ = {})#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the Softshrink module into the given stream.
Softsign#
-
class Softsign : public torch::nn::ModuleHolder<SoftsignImpl>#
A ModuleHolder subclass for SoftsignImpl.
See the documentation for the SoftsignImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = SoftsignImpl#
-
class SoftsignImpl : public torch::nn::Cloneable<SoftsignImpl>#
Applies Softsign over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Softsign to learn about the exact behavior of this module.
Hardshrink#
-
class Hardshrink : public torch::nn::ModuleHolder<HardshrinkImpl>#
A ModuleHolder subclass for HardshrinkImpl.
See the documentation for the HardshrinkImpl class to learn what methods it provides, and examples of how to use Hardshrink with torch::nn::HardshrinkOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = HardshrinkImpl#
-
class HardshrinkImpl : public torch::nn::Cloneable<HardshrinkImpl>#
Applies the hard shrinkage function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Hardshrink to learn about the exact behavior of this module.
See the documentation for the torch::nn::HardshrinkOptions class to learn what constructor arguments are supported for this module.
Example:
Hardshrink model(HardshrinkOptions().lambda(42.42));
Public Functions
-
explicit HardshrinkImpl(const HardshrinkOptions &options_ = {})#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the Hardshrink module into the given stream.
Hardtanh#
-
class Hardtanh : public torch::nn::ModuleHolder<HardtanhImpl>#
A ModuleHolder subclass for HardtanhImpl.
See the documentation for the HardtanhImpl class to learn what methods it provides, and examples of how to use Hardtanh with torch::nn::HardtanhOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = HardtanhImpl#
-
class HardtanhImpl : public torch::nn::Cloneable<HardtanhImpl>#
Applies the HardTanh function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Hardtanh to learn about the exact behavior of this module.
See the documentation for the torch::nn::HardtanhOptions class to learn what constructor arguments are supported for this module.
Example:
Hardtanh model(HardtanhOptions().min_val(-42.42).max_val(0.42).inplace(true));
Tanhshrink#
-
class Tanhshrink : public torch::nn::ModuleHolder<TanhshrinkImpl>#
A ModuleHolder subclass for TanhshrinkImpl.
See the documentation for the TanhshrinkImpl class to learn what methods it provides, or the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = TanhshrinkImpl#
-
class TanhshrinkImpl : public torch::nn::Cloneable<TanhshrinkImpl>#
Applies Tanhshrink over a given input.
See https://pytorch.org/docs/main/nn.html#torch.nn.Tanhshrink to learn about the exact behavior of this module.
Public Functions
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the Tanhshrink module into the given stream.
Threshold#
-
class Threshold : public torch::nn::ModuleHolder<ThresholdImpl>#
A ModuleHolder subclass for ThresholdImpl.
See the documentation for the ThresholdImpl class to learn what methods it provides, and examples of how to use Threshold with torch::nn::ThresholdOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = ThresholdImpl#
-
class ThresholdImpl : public torch::nn::Cloneable<ThresholdImpl>#
Applies the Threshold function element-wise.
See https://pytorch.org/docs/main/nn.html#torch.nn.Threshold to learn about the exact behavior of this module.
See the documentation for the torch::nn::ThresholdOptions class to learn what constructor arguments are supported for this module.
Example:
Threshold model(ThresholdOptions(42.42, 24.24).inplace(true));