
torchrl.trainers.algorithms.configs.collectors.MultiSyncCollectorConfig

class torchrl.trainers.algorithms.configs.collectors.MultiSyncCollectorConfig(
    create_env_fn: Any = '???',
    num_workers: int | None = None,
    policy: Any = None,
    policy_factory: Any = None,
    frames_per_batch: int | None = None,
    init_random_frames: int | None = 0,
    total_frames: int = -1,
    device: str | None = None,
    storing_device: str | None = None,
    policy_device: str | None = None,
    env_device: str | None = None,
    create_env_kwargs: dict | None = None,
    max_frames_per_traj: int | None = None,
    reset_at_each_iter: bool = False,
    postproc: ConfigBase | None = None,
    split_trajs: bool = False,
    exploration_type: str = 'RANDOM',
    set_truncated: bool = False,
    use_buffers: bool = False,
    replay_buffer: ConfigBase | None = None,
    extend_buffer: bool = False,
    trust_policy: bool = True,
    compile_policy: Any = None,
    cudagraph_policy: Any = None,
    no_cuda_sync: bool = False,
    weight_updater: Any = None,
    weight_sync_schemes: Any = None,
    track_policy_version: bool = False,
    local_init_rb: bool = False,
    _target_: str = 'torchrl.collectors.MultiSyncCollector',
    _partial_: bool = False,
)

Configuration class for the multi-process, synchronous data collector (MultiSyncCollector).
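
Examples:
    A minimal, unverified sketch of how this config might be built in Python and
    turned into a collector through Hydra's ``instantiate``, which resolves the
    ``_target_`` field and forwards the remaining fields as keyword arguments.
    The ``make_env`` factory, the ``GymEnv("CartPole-v1")`` environment
    (requires gymnasium) and the two-worker setup are illustrative assumptions,
    not part of this API.

    >>> from hydra.utils import instantiate
    >>> from torchrl.envs import GymEnv
    >>> from torchrl.trainers.algorithms.configs.collectors import MultiSyncCollectorConfig
    >>> def make_env():
    ...     # hypothetical factory; any callable returning an EnvBase should work
    ...     return GymEnv("CartPole-v1")
    >>> cfg = MultiSyncCollectorConfig(
    ...     create_env_fn=[make_env, make_env],  # one env factory per worker
    ...     frames_per_batch=200,                # frames delivered per iteration
    ...     total_frames=2_000,                  # stop after this many frames
    ... )
    >>> # Hydra resolves ``_target_`` ('torchrl.collectors.MultiSyncCollector')
    >>> # and builds the collector from the config fields.
    >>> collector = instantiate(cfg)
    >>> for batch in collector:
    ...     ...  # consume the collected batch
    >>> collector.shutdown()

    Leaving ``policy`` and ``policy_factory`` at ``None`` typically makes the
    collector fall back to a random policy built from the environment's action
    spec; pass a policy module or a policy factory to collect with a trained
    actor instead.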
