RPCTransport

class torchrl.weight_update.RPCTransport(collector_info=None, collector_rref=None, collector_class=None)

RPC transport for communicating with a single RPC remote collector.

This transport handles weight updates for one specific remote collector via torch.distributed.rpc. When several remote collectors are used, one transport is created per collector, following the same pattern as multiprocess collectors.
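A minimal construction sketch, assuming torch.distributed.rpc is already initialized; collector_infos, collector_rrefs, and MyCollector are hypothetical placeholders standing in for whatever handles your RPC setup provides, not part of the API:

    from torchrl.weight_update import RPCTransport

    # One transport per remote collector, mirroring the multiprocess pattern.
    # `collector_infos` and `collector_rrefs` are hypothetical placeholders
    # for the RPC worker info and RRef of each remote collector.
    transports = [
        RPCTransport(
            collector_info=info,
            collector_rref=rref,
            collector_class=MyCollector,  # hypothetical collector class
        )
        for info, rref in zip(collector_infos, collector_rrefs)
    ]

    # check_connection() reports whether torch.distributed.rpc is initialized.
    assert all(t.check_connection() for t in transports)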

check_connection() → bool

Check if RPC is initialized.

receive_weights(timeout: float = 1.0) → tuple[str, Any] | None

RPC workers typically don’t receive weights through this transport.

send_weights(model_id: str, weights: Any) → None

Send weights to the remote collector via RPC.

Note: We don’t pass model_id to the remote collector because remote collectors don’t have weight senders; they apply weights directly to their local policy.
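A hedged one-shot example, assuming transport is a connected RPCTransport from the sketch above, policy is the trainer-side module whose weights the collector mirrors, and the "policy" model_id is an illustrative value:

    # `policy` is an assumed trainer-side module (not part of this API).
    weights = policy.state_dict()
    # Synchronous variant: returns once the RPC call has completed.
    transport.send_weights(model_id="policy", weights=weights)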

send_weights_async(model_id: str, weights: Any) → None

Send weights to remote collector without waiting for completion.

Use wait_ack() to wait for completion after sending to all workers.

wait_ack() → None

Wait for the RPC call to complete.
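Putting send_weights_async() and wait_ack() together gives the fan-out pattern described above. A sketch, reusing the hypothetical transports and policy from the earlier examples:

    weights = policy.state_dict()
    # Dispatch the update to every collector without blocking...
    for transport in transports:
        transport.send_weights_async(model_id="policy", weights=weights)
    # ...then wait for each outstanding RPC call to complete.
    for transport in transports:
        transport.wait_ack()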
