shaksham commented:
@teriterance @EmanuelaBoros @ashleylew This worked for me: try passing the model_parallel_size parameter to llama.build, and set the distributed-training environment variables before calling it:

```python
if distributed_training:
    rank = int(os.environ.get("RANK", "0"))
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    os.environ["RANK"] = str(rank)
    ...
```
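Read in context, the suggestion is to populate the torch.distributed environment variables (RANK, WORLD_SIZE, and so on) before calling llama.build, and to pass model_parallel_size explicitly so the loader knows how many model-parallel shards the checkpoint has. A minimal sketch under those assumptions follows; the facebookresearch/llama package, the checkpoint/tokenizer paths, and the build_generator helper are illustrative, not part of the original comment:

```python
import os

from llama import Llama  # assumes the facebookresearch/llama package is installed

# Hypothetical paths; replace with your own checkpoint and tokenizer locations.
CKPT_DIR = "llama-2-7b/"
TOKENIZER_PATH = "tokenizer.model"


def build_generator(distributed_training: bool = False):
    if distributed_training:
        # Fall back to a single-process setup if torchrun did not set these.
        rank = int(os.environ.get("RANK", "0"))
        world_size = int(os.environ.get("WORLD_SIZE", "1"))
        os.environ["RANK"] = str(rank)
        os.environ["WORLD_SIZE"] = str(world_size)
        os.environ.setdefault("LOCAL_RANK", "0")
        os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
        os.environ.setdefault("MASTER_PORT", "29500")

    # model_parallel_size should match the number of shards the checkpoint
    # was saved with (1 for the 7B model, 2 for 13B, 8 for 70B).
    return Llama.build(
        ckpt_dir=CKPT_DIR,
        tokenizer_path=TOKENIZER_PATH,
        max_seq_len=512,
        max_batch_size=4,
        model_parallel_size=1,
    )


if __name__ == "__main__":
    generator = build_generator(distributed_training=True)
```

Setting the environment variables before llama.build matters because the loader initializes torch.distributed internally and reads them at that point; a single-GPU run can still launch this script directly, while multi-GPU runs would typically rely on torchrun to supply RANK and WORLD_SIZE.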