DistributedWeightSyncScheme

class torchrl.weight_update.DistributedWeightSyncScheme(backend: str = 'gloo', sync: bool = True)[source]

Weight synchronization scheme built on torch.distributed.

This scheme uses torch.distributed primitives (send/recv) to synchronize weights across distributed workers. Each worker gets its own transport, following the same pattern as multiprocess collectors.
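
For context, the scheme is built on plain torch.distributed point-to-point calls. The following is a minimal, illustrative sketch of those primitives (assuming MASTER_ADDR and MASTER_PORT are set in the environment), not the scheme's internal implementation:

    import torch
    import torch.distributed as dist

    def sync_weight(rank: int, world_size: int) -> None:
        # Both processes join the same gloo process group.
        dist.init_process_group("gloo", rank=rank, world_size=world_size)
        weight = torch.zeros(4)
        if rank == 0:
            weight += 1.0                # stand-in for an updated parameter
            dist.send(weight, dst=1)     # trainer pushes the tensor
        else:
            dist.recv(weight, src=0)     # worker receives the update in place
        dist.destroy_process_group()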

Parameters:
  • backend (str) – The distributed backend to use, e.g. “gloo” or “nccl”. Defaults to “gloo”.

  • sync (bool) – Whether weight updates are performed synchronously. Defaults to True.
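
Constructing the scheme is a one-liner; how it is handed to a collector depends on the collector API, so the wiring comment below is an assumption to verify against your TorchRL version:

    from torchrl.weight_update import DistributedWeightSyncScheme

    # Synchronous weight updates over the gloo backend.
    scheme = DistributedWeightSyncScheme(backend="gloo", sync=True)

    # Hypothetical wiring: collectors that support weight-sync schemes
    # typically take a mapping from model name to scheme, e.g.
    #     weight_sync_schemes={"policy": scheme}
    # Check the collector's documentation for the exact keyword.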

create_receiver() → WeightReceiver

Create a receiver for this scheme.

Returns:

WeightReceiver instance configured for this scheme.

create_sender() → WeightSender

Create a sender for this scheme.

Returns:

WeightSender instance configured for this scheme.
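
Roughly, the trainer process builds one sender and each worker builds a receiver. The sketch below only calls the factory methods documented here; the actual push/apply calls live on the returned WeightSender and WeightReceiver objects and are left as comments:

    from torchrl.weight_update import DistributedWeightSyncScheme

    scheme = DistributedWeightSyncScheme(backend="gloo", sync=True)

    # Trainer process: one sender that fans updated weights out to workers.
    sender = scheme.create_sender()

    # Worker process: a receiver that applies incoming weights locally.
    receiver = scheme.create_receiver()

    # The collector/worker loop then registers these objects and drives the
    # actual send/receive calls; see WeightSender and WeightReceiver.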

create_transport(pipe_or_context: Any) → TransportBackend[source]

Create a distributed transport for a specific worker.

Parameters:

pipe_or_context – A tuple of (store, rank) for the worker.

Returns:

DistributedTransport configured for this specific worker.
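
create_transport() takes a (store, rank) pair. The torch.distributed.TCPStore below is only an illustrative stand-in for whatever store the collector already maintains for rendezvous:

    import torch.distributed as dist
    from torchrl.weight_update import DistributedWeightSyncScheme

    scheme = DistributedWeightSyncScheme(backend="gloo", sync=True)

    # Illustrative store; in practice the collector supplies its own.
    store = dist.TCPStore("127.0.0.1", 29500, world_size=2, is_master=True)

    # One transport per worker, keyed by that worker's rank.
    transport = scheme.create_transport((store, 1))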
