Alvaro Bartolome
Cool @matthias-wright! I'm also busy these days (and there are some bank holidays in Spain next week), but I'll try to submit a PR in case you're interested; your project is indeed...
Also, regarding how "complicated" that could be in terms of code, I'd say it's just unpickling the weights, converting them to `safetensors`, and uploading the file to S3/Dropbox or whatever storage you...
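
For reference, a minimal sketch of that conversion, assuming the pickled checkpoint deserializes to a flat dict mapping parameter names to numpy arrays (the filenames here are placeholders, and a nested dict would need flattening first):

```python
import pickle

from safetensors.numpy import save_file

# Unpickle the original weights; assumed here to be a flat dict of
# parameter name -> numpy array.
with open("weights.pkl", "rb") as f:
    state = pickle.load(f)

# Serialize the same dict in the `safetensors` format.
save_file(state, "weights.safetensors")

# The resulting `weights.safetensors` file can then be uploaded to
# S3, Dropbox, or any other storage of choice.
```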
Hi @arnavmehta7! Maybe you can escalate that to the PyTorch team? Here we're just showcasing how to use https://github.com/pytorch/serve, but maybe the issue is in my code; I'm not...
> Hi @alvarobartt, didn't have the time to review thoroughly yet, but could you add some unit tests?

Hey @gabrielmbmb, sorry if this was not clear (see the commit message above)...
> Nice! Let's see how we highlight this in the docs, but looks good to me

Sure, the idea is to include this in the docs once we decide what...
Hi here @amritsingh183! Thanks for opening the issue; indeed, we're already working on this, as well as aligning the supported params for the other LLM providers, and I'll link the PR...
Hi here @amritsingh183, the PR is still a draft, but you can already use it to set `n_ctx` with no issues! Install it from the branch with `pip install git+https://github.com/argilla-io/distilabel.git@align-llm-params` 👍🏻...
Indeed, this has just been merged into `develop`, so feel free to install it from there instead 👍🏻 https://github.com/argilla-io/distilabel/pull/594
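
For anyone landing here, a quick usage sketch after installing from `develop` (`pip install git+https://github.com/argilla-io/distilabel.git@develop`); the import path and parameter names are assumed from the `distilabel` version at the time and may differ in later releases, and the model path is a placeholder:

```python
from distilabel.llms import LlamaCppLLM

# Placeholder path to a local GGUF model; `n_ctx` is the context-window
# param that the linked PR aligns across the LLM integrations.
llm = LlamaCppLLM(
    model_path="./model.gguf",
    n_ctx=4096,
)
llm.load()
```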
Hi @kouyakamada, maybe you can try running it with the environment variable `DISTILABEL_LOG_LEVEL=DEBUG` set? This way we may better understand why it's freezing; also, if you could...
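
In case it helps, a minimal sketch of setting that variable; from the shell it would be `DISTILABEL_LOG_LEVEL=DEBUG python pipeline.py` (the script name is a placeholder), or programmatically before running the pipeline:

```python
import os

# Set the log level before importing/running distilabel so the debug
# logs show where the pipeline is getting stuck.
os.environ["DISTILABEL_LOG_LEVEL"] = "DEBUG"
```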
Hi again @kouyakamada, we've tried to reproduce this but were unable to, so we're closing it now, assuming you got it working with @plaguss's suggestions 👍🏻 Otherwise, feel free to...