sandeep
I've been facing NCCL timeouts, and even after setting the timeout to a larger value in `AcceleratorState`, I still face this error. Finally, I see that this is on the deepspeed side....
The WANDB job: https://wandb.ai/sharma-sandeepch/trlx/runs/f7ym4m9y?workspace=user-sharma-sandeepch _(updated the link to point to a run with a larger batch size)_
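For reference, this is roughly how I set the larger timeout on the accelerate side (a minimal sketch using `InitProcessGroupKwargs`; the 2-hour value is just an example). Even with this, the hang seems to come from deepspeed:

```python
from datetime import timedelta

from accelerate import Accelerator, InitProcessGroupKwargs

# Raise the process-group (NCCL) timeout from the default ~30 min.
# The 2-hour value below is illustrative; pick whatever your workload needs.
pg_kwargs = InitProcessGroupKwargs(timeout=timedelta(hours=2))
accelerator = Accelerator(kwargs_handlers=[pg_kwargs])
```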
@PhungVanDuy @maxreciprocate I cannot seem to request a review (probably due to permission issues / first contribution reasons). Could you please advise?
Thank you so much @PhungVanDuy for reviewing 🙏 Yes, it's the same wandb run I shared above. Here you go: https://wandb.ai/sharma-sandeepch/trlx/runs/f7ym4m9y?workspace=user-sharma-sandeepch
@PhungVanDuy sorry for the delay, the GPUs aren't always available. Here is a DPO run (ongoing) of 1 epoch with `mistral-7b-sft-beta` on the `ultrafeedback_binarized` dataset: https://wandb.ai/sharma-sandeepch/trlx/runs/kfpmeonf?workspace=user-sharma-sandeepch **Note**: -...
Hi, is this still open to work on? I would like to pick it up if that's okay :) @CSerxy I've just forked and begun work...
@ekinsenler Tried this with `"headless": False` and `Xvfb`, and it still yields the same results. Does this have anything to do with `SmartScraperGraph`, which scrapes the `source` URL only, and instead I...
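For context, this is roughly the setup I'm testing with (a minimal sketch; the API key, model name, prompt, and source URL are placeholders):

```python
from scrapegraphai.graphs import SmartScraperGraph

# Placeholder config: api_key, model, prompt, and source are illustrative only.
graph_config = {
    "llm": {
        "api_key": "OPENAI_API_KEY",
        "model": "openai/gpt-4o-mini",
    },
    "headless": False,  # run the browser with a visible display (under Xvfb)
    "verbose": True,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List all the article titles on the page",
    source="https://example.com",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)
```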
@ekinsenler Hmm... I'm curious: is this not a functional example yet? There seems to be a `max_depth` parameter. However, I am facing errors when using this scraper: https://github.com/ScrapeGraphAI/Scrapegraph-ai/blob/2333b513aafae3c358225a8f82f6c01964c0514e/examples/openai/deep_scraper_openai.py
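For completeness, this is roughly how I'm invoking it, following the pattern in that example (a sketch; the prompt, URL, model, and `max_depth` value are placeholders, and I'm assuming `DeepScraperGraph` takes the same `prompt`/`source`/`config` arguments as the other graphs):

```python
from scrapegraphai.graphs import DeepScraperGraph

# Placeholder values throughout; max_depth controls how many link levels are followed.
graph_config = {
    "llm": {
        "api_key": "OPENAI_API_KEY",
        "model": "openai/gpt-4o-mini",
    },
    "verbose": True,
    "max_depth": 2,
}

deep_scraper_graph = DeepScraperGraph(
    prompt="List all the job titles and their descriptions",
    source="https://example.com/careers",
    config=graph_config,
)

result = deep_scraper_graph.run()
print(result)
```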