
VLLMDoubleBufferWeightSender

class torchrl.weight_update.llm.VLLMDoubleBufferWeightSender(scheme: VLLMDoubleBufferSyncScheme)[source]

Sends weights to vLLM workers using double-buffered storage.

This sender extracts weights from a training model and writes them to a shared directory using TensorDict.memmap.

Example

>>> sender = scheme.create_sender()
>>> sender.register_model(policy_model)
>>>
>>> # During training loop
>>> sender.update_weights()  # Writes current weights to shared storage
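
A fuller end-to-end sketch is shown below. The VLLMDoubleBufferSyncScheme constructor argument (a shared directory path) and the training-loop names (policy_model, dataloader, optimizer, compute_loss) are illustrative assumptions only; consult the scheme's documentation for the exact constructor signature.

>>> from torchrl.weight_update.llm import VLLMDoubleBufferSyncScheme
>>>
>>> # Assumed constructor argument: a directory visible to both the trainer and the vLLM workers
>>> scheme = VLLMDoubleBufferSyncScheme("/shared/policy_weights")
>>> sender = scheme.create_sender()
>>> sender.register_model(policy_model)  # e.g. a TransformersWrapper
>>>
>>> for batch in dataloader:
...     loss = compute_loss(policy_model, batch)
...     loss.backward()
...     optimizer.step()
...     optimizer.zero_grad()
...     sender.update_weights()  # memmap the current weights into the shared directory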
register_model(model: Any) → None[source]

Register the model to extract weights from.

Parameters:

model – The model to extract weights from (e.g., TransformersWrapper).
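
A minimal sketch, assuming policy_model is the training-side module (e.g. a TransformersWrapper) whose parameters should be broadcast:

>>> sender.register_model(policy_model)
>>> sender.update_weights()  # with no explicit weights, extracts them from policy_model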

update_weights(weights: Any | None = None) → None[source]

Extract and write weights to shared storage.

Parameters:

weights – Optional weights to send. If None, extracts from registered model.
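
Both calling conventions, as a sketch. Passing the weights as a TensorDict built with TensorDict.from_module is an assumption about the accepted type; the parameter is annotated as Any.

>>> # Implicit form: extract weights from the registered model
>>> sender.update_weights()
>>>
>>> # Explicit form: pass a weight container directly (a TensorDict is assumed here)
>>> from tensordict import TensorDict
>>> weights = TensorDict.from_module(policy_model)  # snapshot of the module's parameters
>>> sender.update_weights(weights)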
