Commit 8bf38df0 authored by Qiang Zhang, committed by Kai Chen

Only import torch.distributed when needed (#882)

* Fix an import error for `get_world_size` and `get_rank`

* Only import torch.distributed when needed

`torch.distributed` is only used in `DistributedGroupSampler`.

* use `get_dist_info` to obtain world size and rank

`get_dist_info` from `mmcv.runner.utils` handles the case where `distributed_c10d` does not exist.
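
For illustration, the "only import when needed" approach from the second commit message can be sketched as a deferred import inside the sampler's constructor. This is a minimal sketch of the pattern the message describes, not the actual intermediate code from that commit:

from torch.utils.data import Sampler

class DistributedGroupSampler(Sampler):
    def __init__(self, dataset, samples_per_gpu=1,
                 num_replicas=None, rank=None):
        # Deferred import: torch.distributed is only touched when the
        # sampler is constructed, so importing the module that defines
        # this class works even on builds without distributed support.
        from torch.distributed import get_rank, get_world_size
        if num_replicas is None:
            num_replicas = get_world_size()
        if rank is None:
            rank = get_rank()
        self.dataset = dataset
        self.samples_per_gpu = samples_per_gpu
        self.num_replicas = num_replicas
        self.rank = rank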
parent f080ccbe
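For reference, the commit message describes `get_dist_info` as returning a (rank, world_size) pair and degrading gracefully when distributed support is missing. A minimal sketch of that behaviour (illustrative only, not the exact mmcv source):

import torch.distributed as dist

def get_dist_info():
    # Sketch: report (rank, world_size) when a process group is up,
    # and fall back to a single-process default otherwise, so callers
    # never need distributed_c10d to exist.
    if dist.is_available() and dist.is_initialized():
        return dist.get_rank(), dist.get_world_size()
    return 0, 1
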
@@ -4,7 +4,7 @@ import math
 import torch
 import numpy as np
-from torch.distributed import get_world_size, get_rank
+from mmcv.runner.utils import get_dist_info
 from torch.utils.data import Sampler
 from torch.utils.data import DistributedSampler as _DistributedSampler
@@ -95,10 +95,11 @@ class DistributedGroupSampler(Sampler):
                  samples_per_gpu=1,
                  num_replicas=None,
                  rank=None):
+        _rank, _num_replicas = get_dist_info()
         if num_replicas is None:
-            num_replicas = get_world_size()
+            num_replicas = _num_replicas
         if rank is None:
-            rank = get_rank()
+            rank = _rank
         self.dataset = dataset
         self.samples_per_gpu = samples_per_gpu
         self.num_replicas = num_replicas
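
With this change, constructing the sampler no longer requires an initialized process group. A hypothetical single-process usage, where `dataset` is assumed to be any dataset the sampler accepts:

# In a non-distributed run, get_dist_info() yields rank 0 and world
# size 1, so the sampler degenerates to single-GPU group sampling
# instead of raising an import or initialization error.
sampler = DistributedGroupSampler(dataset, samples_per_gpu=2)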