Template Class OptimizerCloneableOptions#

Nested Relationships#

Nested Types#

Inheritance Relationships#

Base Type#

Class Documentation#

template<typename Derived>
class OptimizerCloneableOptions : public torch::optim::OptimizerOptions#

OptimizerCloneableOptions provides parameter group inheritance functionality for PyTorch C++ optimizer options.

When a parameter group is created with partial options (e.g., AdamOptions().weight_decay(0.1)), any field the user did not explicitly set inherits the optimizer’s default value, while explicitly set fields are preserved.

This enables Python-like behavior in C++:

// Python equivalent:
// optimizer = Adam([{'params': params1, 'weight_decay': 0.1}], lr=0.01)
// Result: weight_decay=0.1 preserved, lr=0.01 inherited

AdamOptions defaults;
defaults.lr(0.01).weight_decay(0.05);

std::vector<OptimizerParamGroup> groups;
groups.emplace_back(params1, std::make_unique<AdamOptions>(
    AdamOptions().weight_decay(0.1)));  // Only weight_decay specified

Adam optimizer(groups, defaults);
// Result: group inherits lr=0.01, preserves weight_decay=0.1

Implementation: uses SFINAE-based field detection together with constructor-default comparison to distinguish explicitly set fields from untouched ones. A field whose value still equals its constructor default is inherited from the optimizer’s defaults; any other field is preserved as the user set it.
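The constructor-default comparison described above can be sketched in isolation. This is an illustrative reimplementation, not the actual PyTorch code; the names MyOptions and merge_defaults are hypothetical:

#include <cassert>

// Hypothetical options struct; the in-class initializers play the role
// of the constructor defaults that the real implementation compares against.
struct MyOptions {
  double lr = 1e-3;           // constructor default
  double weight_decay = 0.0;  // constructor default
};

// Inherit every field still equal to its constructor default from
// `defaults`; fields the user changed are preserved unchanged.
MyOptions merge_defaults(const MyOptions& user, const MyOptions& defaults) {
  MyOptions ctor;        // fresh instance carries the constructor defaults
  MyOptions out = user;
  if (out.lr == ctor.lr) {
    out.lr = defaults.lr;
  }
  if (out.weight_decay == ctor.weight_decay) {
    out.weight_decay = defaults.weight_decay;
  }
  return out;
}

One consequence of this scheme, visible in the sketch, is that a field the user explicitly sets to its constructor-default value is indistinguishable from an untouched field and will be overwritten by the optimizer’s default.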