
Searched defs:set_requires_gradient_sync (Results 1 – 2 of 2) sorted by relevance

/aosp_15_r20/external/pytorch/torch/distributed/_composable/
replicate.py:163    def set_requires_gradient_sync(self, requires_gradient_sync: bool) -> None:    (member in DDP)
/aosp_15_r20/external/pytorch/torch/distributed/_composable/fsdp/
fully_shard.py:226    def set_requires_gradient_sync(    (member in FSDPModule)
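
Both hits are the gradient-synchronization toggle exposed by PyTorch's composable distributed APIs: the DDP state created by replicate() and the FSDPModule created by fully_shard(). Below is a minimal usage sketch for the FSDPModule variant; it assumes a recent PyTorch build with the composable fully_shard API and an already-initialized process group, and the model, optimizer, loss_fn, and microbatch objects are placeholders. The typical use is gradient accumulation: skip the gradient reduce-scatter on all but the last microbatch, then sync once on the final backward pass.

    from torch.distributed._composable.fsdp import fully_shard

    def shard(model):
        # Shard parameters in place; afterwards `model` is also an FSDPModule
        # and exposes set_requires_gradient_sync().
        fully_shard(model)
        return model

    def accumulate_and_step(model, optimizer, loss_fn, microbatches):
        last = len(microbatches) - 1
        for i, (inputs, targets) in enumerate(microbatches):
            # Skip the gradient reduce-scatter on all but the last microbatch so
            # gradients accumulate locally; sync once on the final backward pass.
            model.set_requires_gradient_sync(i == last)
            loss_fn(model(inputs), targets).backward()
        optimizer.step()
        optimizer.zero_grad()

The replicate()/DDP member at replicate.py:163 serves the same purpose for the composable DDP path: passing False defers gradient all-reduce until it is re-enabled.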