get_gradient_scaler

torchtune.utils.precision.get_gradient_scaler(fsdp: bool = False) → Union[GradScaler, ShardedGradScaler][source]

Returns a gradient scaler for mixed-precision training.

Parameters:

fsdp (bool) – Whether FSDP is being used for training. If True, a shard-aware gradient scaler is returned; otherwise a standard one is returned. Default: False.

Returns:

Gradient scaler object

Return type:

Union[GradScaler, ShardedGradScaler]
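The sketch below illustrates the selection logic and the scale/unscale lifecycle that the returned scaler object implements. It is a self-contained, simplified stand-in: the class bodies here are illustrative assumptions, not torchtune's or PyTorch's actual implementations (in practice the real classes come from `torch.cuda.amp` and `torch.distributed.fsdp.sharded_grad_scaler`).

```python
from typing import Union


class GradScaler:
    """Simplified stand-in for torch.cuda.amp.GradScaler.

    Scales the loss up before backward so that small fp16 gradients
    do not underflow to zero, then unscales gradients before the
    optimizer step.
    """

    def __init__(self, init_scale: float = 2.0**16):
        self.scale_factor = init_scale

    def scale(self, loss: float) -> float:
        # Multiply the loss by the scale factor before backward.
        return loss * self.scale_factor

    def unscale(self, grad: float) -> float:
        # Divide gradients by the same factor before the optimizer step.
        return grad / self.scale_factor


class ShardedGradScaler(GradScaler):
    """Simplified stand-in for the FSDP-aware ShardedGradScaler.

    The real class additionally synchronizes inf/NaN checks across
    shards; that logic is omitted here.
    """


def get_gradient_scaler(fsdp: bool = False) -> Union[GradScaler, ShardedGradScaler]:
    # Mirrors the documented dispatch: shard-aware scaler under FSDP,
    # standard scaler otherwise.
    return ShardedGradScaler() if fsdp else GradScaler()


scaler = get_gradient_scaler(fsdp=True)
print(type(scaler).__name__)               # ShardedGradScaler
print(scaler.unscale(scaler.scale(0.5)))   # scale/unscale round-trips: 0.5
```

In a real training loop the returned scaler is used as `scaler.scale(loss).backward()`, then `scaler.step(optimizer)` and `scaler.update()`, exactly as with `torch.cuda.amp.GradScaler`.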
