torchrl.trainers.algorithms.configs.utils.AdamWConfig

class torchrl.trainers.algorithms.configs.utils.AdamWConfig(lr: float = 0.001, betas: tuple[float, float] = (0.9, 0.999), eps: float = 1e-08, weight_decay: float = 0.01, amsgrad: bool = False, maximize: bool = False, foreach: bool | None = None, capturable: bool = False, differentiable: bool = False, fused: bool | None = None, _target_: str = 'torch.optim.AdamW', _partial_: bool = True)

Configuration for the torch.optim.AdamW optimizer. The default hyperparameters mirror those of torch.optim.AdamW, and _partial_=True marks the config for deferred (partial) instantiation: building the config yields a factory that is called later with the model parameters.
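A minimal usage sketch, assuming the config follows Hydra's _target_/_partial_ instantiation convention (hydra.utils.instantiate, with hydra-core installed); the Linear module below is only a placeholder:

    import torch
    from hydra.utils import instantiate

    from torchrl.trainers.algorithms.configs.utils import AdamWConfig

    cfg = AdamWConfig(lr=3e-4, weight_decay=0.01)

    # Because _partial_=True, instantiate() returns a functools.partial over
    # torch.optim.AdamW rather than a finished optimizer; the parameters are
    # supplied later, once the module exists.
    make_optimizer = instantiate(cfg)

    model = torch.nn.Linear(4, 2)  # placeholder module (assumption)
    optimizer = make_optimizer(model.parameters())  # a torch.optim.AdamW instance

Deferring the parameter argument this way lets the optimizer be declared in a static config file while the model it optimizes is constructed at runtime.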
