class torchrl.trainers.algorithms.configs.utils.SparseAdamConfig(lr: float = 0.001, betas: tuple[float, float] = (0.9, 0.999), eps: float = 1e-08, _target_: str = 'torch.optim.SparseAdam', _partial_: bool = True)[source]

Configuration class for the torch.optim.SparseAdam optimizer. Because _partial_ defaults to True, resolving the config yields a partially applied constructor that still expects the model's parameters, rather than a fully built optimizer.
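The `_target_`/`_partial_` fields follow the Hydra instantiation convention: the hyperparameters (`lr`, `betas`, `eps`) are bound at config-resolution time, and `params` is supplied later when the model exists. A minimal stdlib-only sketch of that pattern (the `FakeSparseAdam` class and `instantiate_partial` helper are stand-ins for illustration, not torchrl or Hydra APIs):

```python
from dataclasses import dataclass
from functools import partial


class FakeSparseAdam:
    """Stand-in for torch.optim.SparseAdam so the sketch runs without torch."""

    def __init__(self, params, lr=0.001, betas=(0.9, 0.999), eps=1e-8):
        self.params = list(params)
        self.lr, self.betas, self.eps = lr, betas, eps


@dataclass
class SparseAdamConfig:
    lr: float = 0.001
    betas: tuple = (0.9, 0.999)
    eps: float = 1e-8


def instantiate_partial(cfg, target):
    # Mimics Hydra's `_partial_: True`: bind the hyperparameters now,
    # leave `params` to be supplied at call time.
    return partial(target, lr=cfg.lr, betas=cfg.betas, eps=cfg.eps)


cfg = SparseAdamConfig(lr=0.01)
factory = instantiate_partial(cfg, FakeSparseAdam)

# The trainer later calls the factory with the model's parameters:
opt = factory(params=[0.0, 1.0])
print(opt.lr)       # 0.01
print(opt.betas)    # (0.9, 0.999)
```

This two-stage construction is why the config can be declared before any network is built: only the final call needs the parameter list.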
