
RayTransport

class torchrl.weight_update.RayTransport(remote_collector=None)

Ray transport for communicating with a single Ray collector actor.

This transport handles weight updates for one specific remote collector. When there are multiple collectors, one transport is created per collector, following the same pattern as multiprocess collectors.
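A minimal sketch of the one-transport-per-collector pattern. Here, remote_collectors is a hypothetical list of Ray actor handles created elsewhere in the training script; only the RayTransport constructor shown above is taken from this page.

```python
from torchrl.weight_update import RayTransport

# `remote_collectors` is a hypothetical list of Ray actor handles
# to remote collector actors, created elsewhere.
transports = [
    RayTransport(remote_collector=collector)  # one transport per collector
    for collector in remote_collectors
]
```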

check_connection() -> bool

Check if Ray is initialized.

receive_weights(timeout: float = 1.0) -> tuple[str, Any] | None

Ray workers typically don’t receive weights through this transport.

send_weights(model_id: str, weights: Any) -> None

Send weights to the remote collector via Ray.

Note: model_id is not forwarded to the remote collector; remote collectors have no weight senders and apply the received weights directly to their local policy.
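A hedged sketch of a single synchronous update, assuming a torch.nn.Module named policy and one transport from the sketch above; the "policy" model id is illustrative only, since the id is not forwarded to the worker.

```python
# Push the trainer's current policy weights to one remote collector.
# `policy` and `transport` are assumed from the surrounding context.
weights = policy.state_dict()
transport.send_weights("policy", weights)  # synchronous variant
```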

send_weights_async(model_id: str, weights: Any) -> None

Send weights to the remote collector without waiting for completion.

Use wait_ack() to wait for completion after sending to all workers.

wait_ack() -> None

Wait for the remote collector to finish applying weights.
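Together, send_weights_async() and wait_ack() suggest a fan-out pattern: dispatch to every worker first, then wait for all acknowledgements. A sketch under the same assumptions as above (hypothetical transports list and policy module):

```python
weights = policy.state_dict()

# Phase 1: non-blocking dispatch to every remote collector.
for transport in transports:
    transport.send_weights_async("policy", weights)

# Phase 2: wait for each collector to finish applying the weights.
for transport in transports:
    transport.wait_ack()
```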
