RayWeightSyncScheme

class torchrl.weight_update.RayWeightSyncScheme(strategy: Literal['state_dict', 'tensordict'] = 'state_dict')[source]

Weight synchronization scheme for Ray distributed computing.

This scheme uses Ray’s object store and remote calls to synchronize weights across distributed workers (Ray actors).

Each remote collector gets its own transport, following the same pattern as multiprocess collectors.
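Example: a minimal sketch of wiring the scheme into a Ray-based collector. The weight_sync_schemes keyword is an assumption about the collector constructor and may differ across TorchRL versions; the remaining calls follow the documented collector and environment APIs.

>>> import torch.nn as nn
>>> from tensordict.nn import TensorDictModule
>>> from torchrl.collectors.distributed import RayCollector
>>> from torchrl.envs import GymEnv
>>> from torchrl.weight_update import RayWeightSyncScheme
>>> policy = TensorDictModule(
...     nn.Linear(4, 2), in_keys=["observation"], out_keys=["action"]
... )
>>> scheme = RayWeightSyncScheme(strategy="state_dict")
>>> collector = RayCollector(
...     create_env_fn=[lambda: GymEnv("CartPole-v1")],
...     policy=policy,
...     frames_per_batch=200,
...     total_frames=10_000,
...     weight_sync_schemes={"policy": scheme},  # assumed keyword
... )
>>> for batch in collector:
...     ...  # optimize the policy, then push fresh weights to the actors
...     collector.update_policy_weights_()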

create_receiver() → WeightReceiver

Create a receiver for this scheme.

Returns:

WeightReceiver instance configured for this scheme.

create_sender() → WeightSender

Create a sender for this scheme.

Returns:

WeightSender instance configured for this scheme.
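In normal use the collector creates and wires these objects itself: the sender lives in the trainer process, and each Ray actor holds a receiver. The sketch below only exercises the two factory methods documented here; registering a model and a transport on the returned objects is collector-internal and version-dependent.

>>> from torchrl.weight_update import RayWeightSyncScheme
>>> scheme = RayWeightSyncScheme(strategy="tensordict")
>>> sender = scheme.create_sender()      # trainer side: broadcasts weights
>>> receiver = scheme.create_receiver()  # worker side: applies incoming weights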

create_transport(pipe_or_context: Any) → TransportBackend[source]

Create Ray-based transport for a specific remote collector.

Parameters:

pipe_or_context – The Ray actor handle for the remote collector.

Returns:

RayTransport configured for this specific remote collector.
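Since each remote collector gets its own transport, expect one create_transport() call per Ray actor handle. A sketch, assuming a running Ray session; the Worker actor below is a purely illustrative stand-in for a remote collector.

>>> import ray
>>> from torchrl.weight_update import RayWeightSyncScheme
>>> ray.init()
>>> @ray.remote
... class Worker:  # illustrative stand-in for a remote collector actor
...     def ping(self):
...         return "pong"
>>> actors = [Worker.remote() for _ in range(2)]
>>> scheme = RayWeightSyncScheme()
>>> transports = [scheme.create_transport(a) for a in actors]  # one per actor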
