Asankhaya Sharma
When we enable suggestions with the PRReview flow, I think the partitioning has some issue, see - https://github.com/codelion/example-python/pull/37#issuecomment-2068508947
@CTY-git can we add a flag that will re-embed the repository? At the moment, if the embeddings already exist and we switch the embedding model, it fails with an error....
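A minimal sketch of the idea behind such a flag, assuming the index stores which embedding model produced it (all names and paths below are illustrative, not the actual implementation):

```python
# Hypothetical sketch, not the real internals: record which embedding model
# built the index, and rebuild (re-embed) instead of erroring out when the
# model changes or when a --re-embed style flag is passed.
import json
from pathlib import Path

def needs_reembedding(index_dir: Path, embedding_model: str, force: bool = False) -> bool:
    """Return True when the repository should be embedded again."""
    meta_file = index_dir / "meta.json"  # assumed location of index metadata
    if force or not meta_file.exists():
        return True
    meta = json.loads(meta_file.read_text())
    # Stored index was produced by a different embedding model -> rebuild it.
    return meta.get("embedding_model") != embedding_model
```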
I had the same issue today; installing transformers 4.35.2 seems to have worked.
@Data-Scientist-Sahil There is some issue if the repo is not already created. See my comment here - https://discord.com/channels/1179035537009545276/1179035537529643040/1271746355093831793 Also check that you are on the latest transformers.
Just join the Unsloth Discord; you may get better support there. This was my comment - ``` The simplest thing may be to just do it in 2 steps. Finish...
Start a new session, just load the model from the folder and push it to the Hub.
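A rough sketch of that second step, assuming the model from step 1 was saved to a local folder (the folder path and repo id below are placeholders, and plain transformers is used rather than Unsloth's own helpers):

```python
# Step 2 only: in a fresh session, load the model saved locally in step 1
# and push it to the Hugging Face Hub. Path and repo id are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("./finetuned-model")
tokenizer = AutoTokenizer.from_pretrained("./finetuned-model")

model.push_to_hub("your-username/your-model")
tokenizer.push_to_hub("your-username/your-model")
```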
Hi @femto, I had a quick look at your repo; it looks interesting. It would be good to see some results on standard benchmarks to compare the difference between the...
This is good. Can you also update it for the moa, mcts, and pvg approaches? It should then fix #67.
Another request for something like this, with some discussion, is here: #99
You can take a look at OptiLLM - https://github.com/codelion/optillm. It has a built-in local inference server, so you can run any LLM directly as an OpenAI API-compatible endpoint.
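A minimal sketch of how that looks from the client side, assuming OptiLLM is running locally (the base URL, API key handling, and model name are assumptions; check the OptiLLM README for the exact setup):

```python
# Illustrative client-side usage: talk to a locally running OptiLLM server
# through the standard OpenAI SDK. base_url, api_key, and model are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local OptiLLM endpoint
    api_key="optillm",                    # placeholder; a local server may ignore it
)

response = client.chat.completions.create(
    model="moa-gpt-4o-mini",  # example: approach prefix (moa) plus base model
    messages=[{"role": "user", "content": "Explain mixture-of-agents in one line."}],
)
print(response.choices[0].message.content)
```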