RayModuleTransformSender

class torchrl.weight_update.RayModuleTransformSender(scheme: RayModuleTransformScheme)[source]

Specialized sender for RayModuleTransform actors.

This sender handles weight updates for models hosted within Ray actors. Unlike the base WeightSender, which communicates with multiprocessing workers over pipes, this sender talks to Ray actors directly via their remote methods.

For Ray actors, there is typically only one shared actor instance, so we store a single transport rather than per-worker transports.
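The single-transport design can be illustrated with a minimal, hypothetical sketch. The class and attribute names below mirror the description above but are illustrative only, not torchrl's actual implementation:

```python
class PipeTransport:
    """Stands in for a multiprocessing pipe connected to one worker."""
    def __init__(self, worker_idx):
        self.worker_idx = worker_idx


class BaseSenderSketch:
    """Per-worker transports, as in the base WeightSender (sketch)."""
    def __init__(self):
        self._transports = {}  # worker_idx -> transport

    def register_worker(self, worker_idx, pipe):
        # One transport per worker: each worker has its own pipe.
        self._transports[worker_idx] = PipeTransport(worker_idx)


class RaySenderSketch:
    """Single shared transport: every worker talks to one Ray actor (sketch)."""
    def __init__(self):
        self._transport = None  # resolved lazily on first use

    def register_worker(self, worker_idx, pipe_or_context):
        # No-op: the Ray actor is shared, so no per-worker state is kept.
        pass
```

This is why `register_worker` below is documented as a no-op for the Ray variant: there is nothing per-worker to store.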

register_worker(worker_idx: int, pipe_or_context: Any) → None[source]

For Ray actors, worker registration is a no-op.

Ray actors are shared across all workers, so we don’t need per-worker transports. The actor reference is resolved lazily on first use.

set_context(context: Any, model_id: str) → None[source]

Set context for lazy actor resolution.

Parameters:
  • context – The collector instance.

  • model_id – String path to the Ray actor (e.g., "env.transform[0]").
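The model_id string can be read as an attribute/index path that is resolved against the context object at first use. A hypothetical resolver (not torchrl's actual code) conveying the idea:

```python
import re


def resolve_path(context, path):
    """Resolve a dotted/indexed path such as 'env.transform[0]' against context.

    Hypothetical helper for illustration: walks attribute names and integer
    indices left to right, e.g. context.env.transform[0].
    """
    obj = context
    # Tokenize into attribute names ('env', 'transform') and indices ('[0]').
    for attr, idx in re.findall(r"(\w+)|\[(\d+)\]", path):
        if attr:
            obj = getattr(obj, attr)
        else:
            obj = obj[int(idx)]
    return obj
```

With a collector whose `env.transform` is an indexable container, `resolve_path(collector, "env.transform[0]")` would return the first transform, which is the object the sender ultimately pushes weights to.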

update_weights(weights: Any) None[source]

Send weights to the Ray actor.

Parameters:
  • weights – Weights to send.
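The end-to-end flow — set the context, then push weights, with the actor handle resolved lazily on the first update — can be sketched as follows. The stub actor and the simple `getattr` lookup are assumptions for illustration; in torchrl the context is the collector, the lookup follows the full model_id path, and the call goes through the actor's remote method:

```python
class FakeRayActor:
    """Stub standing in for a Ray actor hosting a RayModuleTransform."""
    def __init__(self):
        self.weights = None

    def update_weights(self, weights):
        # With a real Ray actor handle this would be a remote call,
        # e.g. actor.update_weights.remote(weights).
        self.weights = weights


class SenderFlowSketch:
    """Illustrative sender: lazy actor resolution, then remote-style update."""
    def __init__(self):
        self._context = None
        self._model_id = None
        self._actor = None

    def set_context(self, context, model_id):
        # Store references only; the actor is not resolved yet.
        self._context = context
        self._model_id = model_id

    def update_weights(self, weights):
        if self._actor is None:
            # Lazy resolution on first use (a plain attribute here;
            # the real sender resolves a path like "env.transform[0]").
            self._actor = getattr(self._context, self._model_id)
        self._actor.update_weights(weights)
```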
