
torchrl.trainers.algorithms.configs.utils.AdamConfig

class torchrl.trainers.algorithms.configs.utils.AdamConfig(lr: float = 0.001, betas: tuple[float, float] = (0.9, 0.999), eps: float = 0.0001, weight_decay: float = 0.0, amsgrad: bool = False, _target_: str = 'torch.optim.Adam', _partial_: bool = True)[source]

Configuration dataclass for the torch.optim.Adam optimizer. Its fields mirror Adam's constructor arguments (lr, betas, eps, weight_decay, amsgrad). Because _partial_ defaults to True, instantiating the config yields a partially applied constructor rather than a ready optimizer, since the parameters to optimize are only supplied later.
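
A minimal usage sketch, assuming the config is consumed through Hydra's instantiate convention (which is what the _target_ and _partial_ fields indicate); the Linear model below is a hypothetical stand-in:

    import torch
    from hydra.utils import instantiate

    from torchrl.trainers.algorithms.configs.utils import AdamConfig

    # Build the config; unspecified fields keep the defaults shown above.
    cfg = AdamConfig(lr=3e-4, weight_decay=1e-5)

    # Because _partial_ is True, instantiate() returns a
    # functools.partial(torch.optim.Adam, ...) that still expects
    # the parameters to optimize.
    make_optimizer = instantiate(cfg)

    model = torch.nn.Linear(4, 2)
    optimizer = make_optimizer(model.parameters())

Deferring the parameters this way lets a trainer hold the optimizer recipe in its config and construct the actual optimizer only once the policy or value network exists.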
