
Is the usage of DistributedSampler correct?

Open nickyi1990 opened this issue 1 year ago • 0 comments

https://github.com/microsoft/DeepSpeedExamples/blob/cd19b3bf1e5b60dd73b09c7463da4eedada1eed7/applications/DeepSpeed-Chat/training/step1_supervised_finetuning/main.py#L234

There are two parameters in DistributedSampler:

        num_replicas (int, optional): Number of processes participating in
            distributed training. By default, :attr:`world_size` is retrieved from the
            current distributed group.
        rank (int, optional): Rank of the current process within :attr:`num_replicas`.
            By default, :attr:`rank` is retrieved from the current distributed
            group.
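
For context, here is a minimal sketch (not the DeepSpeed-Chat code itself) of what those defaults resolve to when the two arguments are omitted, assuming the default process group has already been initialized (e.g. by `deepspeed.init_distributed()` or `torch.distributed.init_process_group()`):

```python
import torch.distributed as dist
from torch.utils.data import DistributedSampler, Dataset

# Hypothetical toy dataset used only for illustration.
class ToyDataset(Dataset):
    def __len__(self):
        return 16
    def __getitem__(self, idx):
        return idx

# Passing the arguments explicitly...
sampler_explicit = DistributedSampler(
    ToyDataset(),
    num_replicas=dist.get_world_size(),  # what the default falls back to
    rank=dist.get_rank(),                # what the default falls back to
)

# ...should behave the same as omitting them, as in main.py#L234,
# provided the process group was initialized before this point.
sampler_default = DistributedSampler(ToyDataset())
```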

Is the code correct when we do not pass these two parameters? Will DeepSpeed handle this automatically for us?

nickyi1990 avatar Apr 15 '23 05:04 nickyi1990