RayModuleTransformReceiver

class torchrl.weight_update.RayModuleTransformReceiver(scheme: RayModuleTransformScheme)

Specialized receiver for RayModuleTransform actors.

This receiver handles weight updates within Ray actors. Since Ray actors receive weights through direct method calls, this receiver primarily validates and applies weights locally.

apply_weights(weights: Any) → None

Apply received weights to the registered model.

For Ray actors, weights are applied directly to the module within the actor’s process space.

Parameters:

weights – The weights to apply.
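Because the receiver lives inside the Ray actor's own process, apply_weights amounts to an in-place parameter update on the registered module. The sketch below illustrates that pattern with toy stand-ins; ToyModule and ToyReceiver are hypothetical names for illustration only, not torchrl API:

```python
# Illustrative sketch only: applying received weights to a registered
# module inside the actor's own process space. No cross-process
# transport is involved at this point -- the weights already arrived
# via a direct method call on the actor.

class ToyModule:
    """Hypothetical stand-in for a model held by the actor."""

    def __init__(self):
        self.params = {"w": 0.0, "b": 0.0}

    def load_state_dict(self, state):
        # Overwrite parameters in place, as a real module would.
        self.params.update(state)


class ToyReceiver:
    """Hypothetical stand-in for the receiver's apply step."""

    def __init__(self, module):
        self.module = module  # the "registered model"

    def apply_weights(self, weights):
        # Validate minimally, then apply locally.
        if not isinstance(weights, dict):
            raise TypeError("expected a mapping of parameter names to values")
        self.module.load_state_dict(weights)


module = ToyModule()
receiver = ToyReceiver(module)
receiver.apply_weights({"w": 1.5, "b": -0.5})
print(module.params)  # {'w': 1.5, 'b': -0.5}
```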

receive(timeout: float = 0.001) → bool

Check for and apply new weights (non-blocking).

This method is called in the worker’s main loop to check if new weights have been sent. If weights are available, they are applied to the registered model immediately.

Parameters:

timeout – Maximum time to wait for weights (seconds). Use 0 for immediate return.

Returns:

True if weights were received and applied; False if no weights were available.

Note: For SharedMemWeightSyncScheme, this always returns False since workers automatically see updates via shared memory.
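The receive contract described above (a non-blocking poll with a timeout, returning a boolean) can be sketched with a plain queue standing in for the weight channel. PollingReceiver and its send method are illustrative assumptions, not torchrl API:

```python
# Illustrative sketch only: a non-blocking receive() loop step, with a
# queue.Queue standing in for however weights actually arrive.
import queue


class PollingReceiver:
    """Hypothetical stand-in modeling the receive() contract."""

    def __init__(self):
        self._inbox = queue.Queue()
        self.applied = []  # weights applied so far

    def send(self, weights):
        # Illustrative producer side, used here only to drive the demo.
        self._inbox.put(weights)

    def receive(self, timeout=0.001):
        """Return True if weights were received and applied."""
        try:
            # With timeout=0 an empty queue raises Empty immediately;
            # a small positive timeout waits briefly for weights.
            weights = self._inbox.get(timeout=timeout)
        except queue.Empty:
            return False
        self.applied.append(weights)  # "apply" to the registered model
        return True


r = PollingReceiver()
print(r.receive(timeout=0.0))  # False: no weights were sent yet
r.send({"w": 2.0})
print(r.receive())             # True: weights received and applied
```

In a worker's main loop, this call would run once per iteration so that weight updates are picked up promptly without blocking collection.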
