Normalization Layers#
Normalization layers stabilize and accelerate training by normalizing intermediate activations. They help with gradient flow and allow higher learning rates.
BatchNorm: Normalizes across batch dimension; most common in CNNs
InstanceNorm: Normalizes each sample independently; popular in style transfer
LayerNorm: Normalizes across feature dimension; standard in transformers
GroupNorm: Normalizes within groups of channels; works with small batches
LocalResponseNorm: Lateral inhibition inspired by neuroscience (less common today)
BatchNorm1d / BatchNorm2d / BatchNorm3d#
-
class BatchNorm1d : public torch::nn::ModuleHolder<BatchNorm1dImpl>#
A ModuleHolder subclass for BatchNorm1dImpl. See the documentation for the BatchNorm1dImpl class to learn what methods it provides, and examples of how to use BatchNorm1d with torch::nn::BatchNorm1dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = BatchNorm1dImpl#
-
class BatchNorm1dImpl : public torch::nn::BatchNormImplBase<1, BatchNorm1dImpl>#
Applies the BatchNorm1d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.BatchNorm1d to learn about the exact behavior of this module.
See the documentation for the torch::nn::BatchNorm1dOptions class to learn what constructor arguments are supported for this module.
Example:
BatchNorm1d model(BatchNorm1dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
-
class BatchNorm2d : public torch::nn::ModuleHolder<BatchNorm2dImpl>#
A ModuleHolder subclass for BatchNorm2dImpl. See the documentation for the BatchNorm2dImpl class to learn what methods it provides, and examples of how to use BatchNorm2d with torch::nn::BatchNorm2dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = BatchNorm2dImpl#
-
class BatchNorm2dImpl : public torch::nn::BatchNormImplBase<2, BatchNorm2dImpl>#
Applies the BatchNorm2d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.BatchNorm2d to learn about the exact behavior of this module.
See the documentation for the torch::nn::BatchNorm2dOptions class to learn what constructor arguments are supported for this module.
Example:
BatchNorm2d model(BatchNorm2dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
-
class BatchNorm3d : public torch::nn::ModuleHolder<BatchNorm3dImpl>#
A ModuleHolder subclass for BatchNorm3dImpl. See the documentation for the BatchNorm3dImpl class to learn what methods it provides, and examples of how to use BatchNorm3d with torch::nn::BatchNorm3dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = BatchNorm3dImpl#
-
class BatchNorm3dImpl : public torch::nn::BatchNormImplBase<3, BatchNorm3dImpl>#
Applies the BatchNorm3d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.BatchNorm3d to learn about the exact behavior of this module.
See the documentation for the torch::nn::BatchNorm3dOptions class to learn what constructor arguments are supported for this module.
Example:
BatchNorm3d model(BatchNorm3dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
Example:
auto bn = torch::nn::BatchNorm2d(
torch::nn::BatchNorm2dOptions(64) // num_features
.eps(1e-5)
.momentum(0.1)
.affine(true)
.track_running_stats(true));
InstanceNorm1d / InstanceNorm2d / InstanceNorm3d#
-
class InstanceNorm1d : public torch::nn::ModuleHolder<InstanceNorm1dImpl>#
A ModuleHolder subclass for InstanceNorm1dImpl. See the documentation for the InstanceNorm1dImpl class to learn what methods it provides, and examples of how to use InstanceNorm1d with torch::nn::InstanceNorm1dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = InstanceNorm1dImpl#
-
class InstanceNorm1dImpl : public torch::nn::InstanceNormImpl<1, InstanceNorm1dImpl>#
Applies the InstanceNorm1d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.InstanceNorm1d to learn about the exact behavior of this module.
See the documentation for the torch::nn::InstanceNorm1dOptions class to learn what constructor arguments are supported for this module.
Example:
InstanceNorm1d model(InstanceNorm1dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
-
class InstanceNorm2d : public torch::nn::ModuleHolder<InstanceNorm2dImpl>#
A ModuleHolder subclass for InstanceNorm2dImpl. See the documentation for the InstanceNorm2dImpl class to learn what methods it provides, and examples of how to use InstanceNorm2d with torch::nn::InstanceNorm2dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = InstanceNorm2dImpl#
-
class InstanceNorm2dImpl : public torch::nn::InstanceNormImpl<2, InstanceNorm2dImpl>#
Applies the InstanceNorm2d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.InstanceNorm2d to learn about the exact behavior of this module.
See the documentation for the torch::nn::InstanceNorm2dOptions class to learn what constructor arguments are supported for this module.
Example:
InstanceNorm2d model(InstanceNorm2dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
-
class InstanceNorm3d : public torch::nn::ModuleHolder<InstanceNorm3dImpl>#
A ModuleHolder subclass for InstanceNorm3dImpl. See the documentation for the InstanceNorm3dImpl class to learn what methods it provides, and examples of how to use InstanceNorm3d with torch::nn::InstanceNorm3dOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = InstanceNorm3dImpl#
-
class InstanceNorm3dImpl : public torch::nn::InstanceNormImpl<3, InstanceNorm3dImpl>#
Applies the InstanceNorm3d function.
See https://pytorch.org/docs/main/nn.html#torch.nn.InstanceNorm3d to learn about the exact behavior of this module.
See the documentation for the torch::nn::InstanceNorm3dOptions class to learn what constructor arguments are supported for this module.
Example:
InstanceNorm3d model(InstanceNorm3dOptions(4).eps(0.5).momentum(0.1).affine(false).track_running_stats(true));
LayerNorm#
-
class LayerNorm : public torch::nn::ModuleHolder<LayerNormImpl>#
A ModuleHolder subclass for LayerNormImpl. See the documentation for the LayerNormImpl class to learn what methods it provides, and examples of how to use LayerNorm with torch::nn::LayerNormOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = LayerNormImpl#
-
class LayerNormImpl : public torch::nn::Cloneable<LayerNormImpl>#
Applies Layer Normalization over a mini-batch of inputs as described in the paper Layer Normalization (https://arxiv.org/abs/1607.06450). See https://pytorch.org/docs/main/nn.html#torch.nn.LayerNorm to learn about the exact behavior of this module.
See the documentation for the torch::nn::LayerNormOptions class to learn what constructor arguments are supported for this module.
Example:
LayerNorm model(LayerNormOptions({2, 2}).elementwise_affine(false).eps(2e-5));
Public Functions
-
inline LayerNormImpl(std::vector<int64_t> normalized_shape)#
-
explicit LayerNormImpl(LayerNormOptions options_)#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
void reset_parameters()#
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the LayerNorm module into the given stream.
-
Tensor forward(const Tensor &input)#
Applies layer normalization over a mini-batch of inputs as described in the paper Layer Normalization (https://arxiv.org/abs/1607.06450). The mean and standard deviation are calculated separately over the last dimensions of the input, which must match the shape given by normalized_shape.
Example:
auto ln = torch::nn::LayerNorm(
torch::nn::LayerNormOptions({768})); // normalized_shape
GroupNorm#
-
class GroupNorm : public torch::nn::ModuleHolder<GroupNormImpl>#
A ModuleHolder subclass for GroupNormImpl. See the documentation for the GroupNormImpl class to learn what methods it provides, and examples of how to use GroupNorm with torch::nn::GroupNormOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = GroupNormImpl#
-
class GroupNormImpl : public torch::nn::Cloneable<GroupNormImpl>#
Applies Group Normalization over a mini-batch of inputs as described in the paper Group Normalization. See https://pytorch.org/docs/main/nn.html#torch.nn.GroupNorm to learn about the exact behavior of this module.
See the documentation for the torch::nn::GroupNormOptions class to learn what constructor arguments are supported for this module.
Example:
GroupNorm model(GroupNormOptions(2, 2).eps(2e-5).affine(false));
Public Functions
-
inline GroupNormImpl(int64_t num_groups, int64_t num_channels)#
-
explicit GroupNormImpl(const GroupNormOptions &options_)#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
void reset_parameters()#
Example:
auto gn = torch::nn::GroupNorm(
torch::nn::GroupNormOptions(32, 256)); // num_groups, num_channels
LocalResponseNorm#
-
class LocalResponseNorm : public torch::nn::ModuleHolder<LocalResponseNormImpl>#
A ModuleHolder subclass for LocalResponseNormImpl. See the documentation for the LocalResponseNormImpl class to learn what methods it provides, and examples of how to use LocalResponseNorm with torch::nn::LocalResponseNormOptions. See the documentation for ModuleHolder to learn about PyTorch’s module storage semantics.
Public Types
-
using Impl = LocalResponseNormImpl#
-
class LocalResponseNormImpl : public torch::nn::Cloneable<LocalResponseNormImpl>#
Applies local response normalization over an input signal composed of several input planes, where channels occupy the second dimension.
Applies normalization across channels. See https://pytorch.org/docs/main/nn.html#torch.nn.LocalResponseNorm to learn about the exact behavior of this module.
See the documentation for the torch::nn::LocalResponseNormOptions class to learn what constructor arguments are supported for this module.
Example:
LocalResponseNorm model(LocalResponseNormOptions(2).alpha(0.0002).beta(0.85).k(2.));
Public Functions
-
inline LocalResponseNormImpl(int64_t size)#
-
explicit LocalResponseNormImpl(const LocalResponseNormOptions &options_)#
-
virtual void reset() override#
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override#
Pretty prints the LocalResponseNormImpl module into the given stream.