TransportBackend¶
- class torchrl.weight_update.TransportBackend(*args, **kwargs)[source]¶
Abstract interface over the communication mechanisms used to synchronize weights between a sender and receiver.
- receive_weights(timeout: float | None = None, *, weights: Any = None, model: Any = None, strategy: torchrl.weight_update.weight_sync_schemes.WeightStrategy | None = None) Any | None[source]¶
Receive weights from the sender and apply them to the model.
- Parameters:
timeout – Maximum time to wait for weights (seconds). None means no timeout (blocking). Some transports may not support timeout and will raise ValueError if specified.
weights – Pre-allocated weight buffer to receive into.
model – The model to apply weights to.
strategy – Strategy for applying weights to the model.
- Returns:
The received (and applied) weights, or None if the timeout expired before any weights became available.
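The contract above can be sketched with a minimal in-process transport. This is a hypothetical illustration, not torchrl's implementation: `QueueTransport`, `send_weights`, and the callable `strategy` are assumptions introduced here to show the blocking/timeout semantics of `receive_weights`.

```python
import queue
from typing import Any, Optional


class QueueTransport:
    """Hypothetical transport sketching the receive_weights contract:
    block until weights arrive (or until `timeout`), apply them to the
    model via a strategy callable, and return them; return None if the
    timeout expires first."""

    def __init__(self) -> None:
        self._queue: "queue.Queue[Any]" = queue.Queue()

    def send_weights(self, weights: Any) -> None:
        # Sender side: make a new set of weights available.
        self._queue.put(weights)

    def receive_weights(
        self,
        timeout: Optional[float] = None,
        *,
        weights: Any = None,
        model: Any = None,
        strategy=None,
    ) -> Optional[Any]:
        try:
            # timeout=None blocks indefinitely, matching the documented default.
            received = self._queue.get(timeout=timeout)
        except queue.Empty:
            return None  # timeout expired: no weights available
        if strategy is not None and model is not None:
            strategy(model, received)  # apply weights to the model
        return received


transport = QueueTransport()
transport.send_weights({"w": 1.0})
model = {}
applied = transport.receive_weights(
    timeout=1.0, model=model, strategy=lambda m, w: m.update(w)
)
```

Here `applied` is the received weight dict and `model` has been updated in place; a second call with a short timeout would return None, since no further weights were sent.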
- setup_connection_and_weights_on_receiver(*, worker_idx: int, weights: Any = None, model: Any = None, strategy: torchrl.weight_update.weight_sync_schemes.WeightStrategy | None = None) Any[source]¶
Synchronize weights on worker side before collection starts.
This is called once in each worker after initialization to receive the initial weights. For most transports this is a no-op, since weights arrive through receive_weights(); SharedMemTransport instead returns the received weights here.
- Parameters:
worker_idx – The worker index.
weights – Pre-allocated weight buffer to receive into.
model – The model to apply weights to.
strategy – Strategy for applying weights to the model.
- Returns:
The received weights (for SharedMemTransport), otherwise None.
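The worker-side initialization flow can be sketched as follows. This is a hedged illustration, not torchrl's SharedMemTransport: `InProcessTransport` and its constructor argument are hypothetical, and it models only the shared-memory-like case where the one-time setup call delivers the initial weights.

```python
from typing import Any, Optional


class InProcessTransport:
    """Hypothetical receiver-side transport sketching the worker init
    flow: the one-time setup call hands over the initial weights and
    applies them via the strategy, mirroring the shared-memory case
    (other transports would simply return None here)."""

    def __init__(self, initial_weights: Any) -> None:
        self._initial = initial_weights

    def setup_connection_and_weights_on_receiver(
        self,
        *,
        worker_idx: int,
        weights: Any = None,
        model: Any = None,
        strategy=None,
    ) -> Optional[Any]:
        # Called once per worker, before collection starts.
        if strategy is not None and model is not None:
            strategy(model, self._initial)  # apply initial weights
        return self._initial


# Worker side: synchronize once before collection starts.
model = {}
transport = InProcessTransport({"w": 0.5})
init = transport.setup_connection_and_weights_on_receiver(
    worker_idx=0, model=model, strategy=lambda m, w: m.update(w)
)
```

After this call the worker's model holds the initial weights; subsequent updates would then flow through receive_weights() during collection.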