
RPCWeightSyncScheme

class torchrl.weight_update.RPCWeightSyncScheme(strategy: Literal['state_dict', 'tensordict'] = 'state_dict')

Weight synchronization scheme for torch.distributed.rpc.

This scheme uses RPC calls to synchronize weights across distributed workers. Each remote collector gets its own transport, following the same pattern as multiprocess collectors.

create_receiver() → WeightReceiver

Create a receiver for this scheme.

Returns:

WeightReceiver instance configured for this scheme.

create_sender() → WeightSender

Create a sender for this scheme.

Returns:

WeightSender instance configured for this scheme.

create_transport(pipe_or_context: Any) → TransportBackend

Create RPC-based transport for a specific remote collector.

Parameters:

pipe_or_context – A tuple of (collector_info, collector_rref, collector_class) for the remote collector.

Returns:

RPCTransport configured for this specific remote collector.
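To make the one-transport-per-collector pattern concrete, the sketch below mimics it with toy in-process stand-ins. The `ToyTransport`, `ToySender`, and `ToyReceiver` classes here are hypothetical illustrations, not TorchRL classes: a real deployment would use `RPCWeightSyncScheme.create_sender()`/`create_receiver()` and RPC-backed transports over an initialized `torch.distributed.rpc` process group.

```python
# Illustrative sketch only: mimics the sender/transport/receiver pattern
# with toy in-process classes. The real scheme moves weights over
# torch.distributed.rpc; these names are stand-ins, not TorchRL APIs.

class ToyTransport:
    """Stands in for an RPC transport: carries weights to one worker."""
    def __init__(self):
        self.inbox = None

    def send(self, weights):
        self.inbox = weights


class ToySender:
    """Stands in for WeightSender: holds one transport per remote collector."""
    def __init__(self):
        self.transports = []

    def register(self, transport):
        self.transports.append(transport)

    def push(self, weights):
        # Broadcast a state_dict-style payload to every registered worker.
        for t in self.transports:
            t.send(dict(weights))


class ToyReceiver:
    """Stands in for WeightReceiver: applies received weights on the worker."""
    def __init__(self, transport):
        self.transport = transport

    def apply(self, model):
        model.update(self.transport.inbox)


# One transport per remote collector, mirroring create_transport().
t1, t2 = ToyTransport(), ToyTransport()
sender = ToySender()
sender.register(t1)
sender.register(t2)

worker_model = {"w": 0.0}
sender.push({"w": 1.5})
ToyReceiver(t1).apply(worker_model)
print(worker_model["w"])  # 1.5
```

The design point the sketch preserves is that the sender fans out to per-worker transports, so each remote collector can be updated (or fail) independently, matching how multiprocess collectors handle weight updates.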
