
WeightReceiver

class torchrl.weight_update.WeightReceiver(scheme: WeightSyncScheme)[source]

Receives weights for ONE model in ONE worker.

Created and managed by WeightSyncScheme. Users should not instantiate directly.

apply_weights(weights: Any) → None[source]

Apply received weights to the registered model (legacy).

Parameters:

weights – The weights to apply.

Note

This is the legacy method. Use receive() in the worker loop instead.
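A minimal sketch of the legacy path, assuming receiver is a WeightReceiver that a WeightSyncScheme has already created with the worker's model registered; trained_policy is a hypothetical source module used here only to produce a weight payload:

import torch.nn as nn

# Assumptions: `receiver` was set up by a WeightSyncScheme and already has the
# worker's model registered; `trained_policy` is a hypothetical trained module.
trained_policy = nn.Linear(4, 2)
weights = trained_policy.state_dict()   # plain state_dict used as the weight payload
receiver.apply_weights(weights)         # copies the payload into the registered model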

receive(timeout: float = 0.001) → bool[source]

Check for and apply new weights (non-blocking).

This method is called in the worker’s main loop to check if new weights have been sent. If weights are available, they are applied to the registered model immediately.

Parameters:

timeout – Maximum time to wait for weights (seconds). Use 0 for immediate return.

Returns:

True if weights were received and applied; False if no weights were available.

Note

For SharedMemWeightSyncScheme, this always returns False since workers automatically see updates via shared memory.
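A hedged sketch of how receive() might be polled in a worker's main loop; only receive() itself comes from this API, while receiver, run_one_step, and should_stop are illustrative assumptions standing in for objects the scheme and the worker would provide:

# Sketch of a worker main loop that polls for weight updates each iteration.
# Assumptions: `receiver` was created for this worker by a WeightSyncScheme;
# `run_one_step` and `should_stop` are hypothetical callables representing the
# worker's actual work and its termination condition.
def worker_loop(receiver, run_one_step, should_stop):
    while not should_stop():
        if receiver.receive(timeout=0.001):   # non-blocking check for new weights
            # Weights have already been applied to the registered model.
            pass
        run_one_step()                        # one unit of the worker's real work

With SharedMemWeightSyncScheme, the call remains safe but always returns False, since shared-memory workers see updates without an explicit receive step.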
