WeightSender¶
- class torchrl.weight_update.WeightSender(scheme: WeightSyncScheme)[source]¶
Sends weights for ONE model to workers.
A single sender can broadcast to all workers or send to specific workers. A WeightSender is created and managed by its WeightSyncScheme; users should not instantiate it directly.
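For orientation, a minimal usage sketch. The scheme class shown and the create_sender() accessor are assumptions for illustration, not APIs documented on this page; the real entry point is whichever WeightSyncScheme implementation your setup is configured with:

```python
import torch.nn as nn
from torchrl.weight_update import MultiProcessWeightSyncScheme  # assumed scheme

policy = nn.Linear(4, 2)

# The scheme creates and owns the sender; WeightSender is never
# instantiated directly by user code.
scheme = MultiProcessWeightSyncScheme()
sender = scheme.create_sender()  # hypothetical accessor, for illustration only

# Broadcast the policy's current weights to every worker and block
# until all workers have applied them.
sender.send(weights=policy)
```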
- send(weights: Any = None, worker_ids: int | list[int] | None = None) None[source]¶
Send weights synchronously to workers.
This method:
1. Prepares the weights (extracts them from the model if weights=None)
2. Sends them to the specified workers (or to all workers if worker_ids=None)
3. Waits for acknowledgments from those workers
4. Returns once the workers have applied the weights
- Parameters:
weights – Weights to send. Can be:
- None: extract from the model via context.get_model(model_id)
- nn.Module: extract the weights from the module
- TensorDict: use directly
- dict: convert to a TensorDict

worker_ids – Which workers to send to:
- None: send to all workers (default)
- int: send to a single worker
- list[int]: send to the specified workers
Note: This is a blocking call that ensures the specified workers have been updated before returning.
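A short sketch of the documented call patterns, assuming sender and policy were set up as in the example above:

```python
from tensordict import TensorDict

# weights=None: extract weights from the registered model and
# broadcast to all workers (worker_ids=None).
sender.send()

# Extract weights from an explicit nn.Module and send to worker 0 only.
sender.send(weights=policy, worker_ids=0)

# Send a TensorDict of weights to a specific subset of workers.
weights_td = TensorDict.from_module(policy)
sender.send(weights=weights_td, worker_ids=[0, 2])
```

Each call returns only once the targeted workers have acknowledged and applied the update.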
- send_async(weights: Any = None, worker_ids: int | list[int] | None = None) None[source]¶
Send weights asynchronously to workers (non-blocking).
This initiates the send and returns immediately, without waiting for workers to acknowledge. You must call wait_async() before issuing the next send_async() or send() call (see the sketch after this section).
- Parameters:
weights – Same as send()
worker_ids – Same as send()
- Raises:
RuntimeError – If a previous send_async() is still pending
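A sketch of the pairing that the pending-send rule implies; overlapping the transfer with other work is an assumed use case, and sender and policy are carried over from the examples above:

```python
# Start the transfer without blocking the caller.
sender.send_async(weights=policy)

# ... do other work here while the weights are in flight ...

# Complete the pending send. Calling send() or send_async() again
# before this point raises a RuntimeError.
sender.wait_async()
```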