Asankhaya Sharma

Results 47 comments of Asankhaya Sharma

Yes, you can add a new plugin that supports this. Add it to the plugins directory and submit a PR.

The following code shows how to use the OpenAI-compatible API to get logprobs from optillm:

```
messages = [
    {
        "role": "user",
        "content": "How many rs are there in strawberry? Use...
```
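Since the excerpt above is cut off, here is a minimal, self-contained sketch of what a logprobs request against an OpenAI-compatible endpoint typically looks like, and how the returned structure can be parsed. The model name, prompt continuation, and helper names below are illustrative assumptions, not part of the original comment, and the response is mocked so the parsing runs without a server:

```python
# Sketch of a logprobs request against an OpenAI-compatible endpoint
# such as the optillm proxy. Model name and helpers are assumptions.
request_kwargs = {
    "model": "gpt-4o-mini",   # any model served behind the proxy
    "messages": [
        {"role": "user",
         "content": "How many rs are there in strawberry?"}
    ],
    "logprobs": True,         # ask for per-token log-probabilities
    "top_logprobs": 3,        # and the 3 most likely alternatives per token
}

def extract_token_logprobs(response: dict) -> list[tuple[str, float]]:
    """Pull (token, logprob) pairs out of a chat-completion response."""
    content = response["choices"][0]["logprobs"]["content"]
    return [(item["token"], item["logprob"]) for item in content]

# Mocked response in the shape the OpenAI API documents, so the
# extraction can be exercised offline:
mock_response = {
    "choices": [{
        "logprobs": {"content": [
            {"token": "There", "logprob": -0.01},
            {"token": " are",  "logprob": -0.02},
            {"token": " 3",    "logprob": -0.15},
        ]}
    }]
}
print(extract_token_logprobs(mock_response))
```

Against a live proxy, the same `request_kwargs` would be passed to `client.chat.completions.create()` from the `openai` package, with the client's base URL pointed at optillm.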

Hi @AdityaPandeyCN, for the UI we can have a simple chat-like interface that connects to the optillm proxy and lets users choose the approaches and try out queries. Something...

You can still continue working on that and we will add it in when ready. I just added this today since I saw https://www.gradio.app/guides/creating-a-chatbot-fast#quickly-loading-from-ollama-or-any-open-ai-api-compatible-endpoint with the new release of Gradio....

This would be a good addition; the Z3 reasoning implemented in optillm now also includes SymPy. It will be interesting to add more solvers/formal tools that can help with reasoning....
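As a toy illustration of the kind of step such a solver integration delegates to a formal tool (a sketch, not optillm's actual plugin code), SymPy can solve exactly an equation that an LLM has translated from natural language, so the model does the translation and the solver does the arithmetic:

```python
from sympy import Eq, solve, symbols

# Toy example: "a number squared, minus five times the number, plus six,
# equals zero" translated into a symbolic equation; SymPy finds the
# exact roots instead of the LLM guessing them.
x = symbols("x")
equation = Eq(x**2 - 5 * x + 6, 0)
roots = solve(equation, x)
print(roots)  # [2, 3]
```

The same pattern generalizes: the model emits a machine-checkable artifact (an equation, an SMT formula) and the solver's answer is fed back into the reasoning chain.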

Can you run it locally? It doesn't run in headless mode by default. You have to enter captchas that pop up from time to time, so it opens an actual browser...

You can try using optillm - https://github.com/codelion/optillm. In fact, we recently implemented something similar for reasoning LLMs like DeepSeek-R1 here - https://github.com/codelion/optillm/blob/main/optillm/thinkdeeper.py

@klieret any idea when this will be fixed? Is there a workaround until then?

Can you not create a new block that has two inputs and one output, and then use it to combine two outputs into one?

I have actually fine-tuned a local model to apply changes from other LLMs to code. You can take a look at https://huggingface.co/patched-codes/Llama-3.2-1B-FastApply. The model is trained to merge changes...
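The exact prompt template for that model is defined on its model card; purely as an illustration of the fast-apply idea, here is a hypothetical sketch of the surrounding plumbing: building a merge prompt from the original file plus an update snippet, and stripping a markdown fence from the model's reply. Every marker and helper name here is an assumption, not the model's documented format:

```python
def build_merge_prompt(original_code: str, update_snippet: str) -> str:
    """Hypothetical prompt layout for a code-merge model; the real
    template is defined by the model card, not by this sketch."""
    return (
        "Merge the update snippet into the original code.\n"
        f"<original>\n{original_code}\n</original>\n"
        f"<update>\n{update_snippet}\n</update>\n"
        "Return only the merged file."
    )

def extract_merged_code(model_reply: str) -> str:
    """Strip an optional markdown code fence from the model's reply."""
    reply = model_reply.strip()
    if reply.startswith("```"):
        lines = reply.splitlines()
        # drop the opening fence (with its language tag) and, if
        # present, the closing fence
        body = lines[1:-1] if lines[-1].strip() == "```" else lines[1:]
        return "\n".join(body)
    return reply

prompt = build_merge_prompt(
    "def add(a, b):\n    return a + b\n",
    "def add(a, b, c=0):\n    return a + b + c\n",
)
merged = extract_merged_code(
    "```python\ndef add(a, b, c=0):\n    return a + b + c\n```"
)
print(merged)
```

In practice the prompt would be sent to the model (e.g. via a `transformers` text-generation pipeline) and `extract_merged_code` applied to its completion before writing the file back to disk.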