ClearCudaCache

class torchrl.trainers.ClearCudaCache(interval: int)[source]

Clears the CUDA cache at a given interval.

Examples

>>> from torchrl.trainers import ClearCudaCache
>>> clear_cuda = ClearCudaCache(100)
>>> trainer.register_op("pre_optim_steps", clear_cuda)
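The hook is simply a callable that the trainer invokes each time its registered entry point is reached. A minimal sketch of an equivalent interval-based hook is shown below; the call counter and the use of torch.cuda.empty_cache() are illustrative assumptions, not a reproduction of the library code.

>>> import torch
>>> class ClearCudaCacheSketch:
...     # Hypothetical stand-in: empty the CUDA cache every `interval` calls.
...     def __init__(self, interval: int):
...         self.interval = interval
...         self.count = 0
...     def __call__(self, *args, **kwargs):
...         self.count += 1
...         if self.count % self.interval == 0:
...             torch.cuda.empty_cache()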
abstract register(trainer: Trainer, name: str)

Registers the hook in the trainer at a default location.

Parameters:
  • trainer (Trainer) – the trainer where the hook must be registered.

  • name (str) – the name of the hook.

Note

To register the hook at a location other than the default, use register_op().
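For example, a hedged sketch of attaching the hook to a different entry point via register_op(); the "post_optim" entry point is chosen here purely for illustration.

>>> clear_cuda = ClearCudaCache(1000)
>>> trainer.register_op("post_optim", clear_cuda)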
