Steve

Results: 78 comments by Steve

In the Docker image (there is a Dockerfile), is it possible to install pandoc?

Thanks; indeed, I had an issue loading it. Also, is it possible to remove the dependency on CUDA? I expect the embeddings and LLM to run on a different...

1. What about leveraging [litellm](https://github.com/BerriAI/litellm) to fix this issue?
2. Is it possible, in the interim, to set the `base_url` of the OpenAI endpoint? We could still use litellm as...

Thanks for the confirmation. One way to achieve what you want could be to use the [ON CONFLICT DO NOTHING](https://www.postgresql.org/docs/current/sql-insert.html#SQL-ON-CONFLICT) clause.
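For illustration, a minimal sketch of the idea, using SQLite's equivalent `ON CONFLICT DO NOTHING` syntax (the linked docs are for PostgreSQL; the table and column names here are hypothetical):

```python
import sqlite3

# In-memory database with a primary-key constraint to conflict against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO items (id, name) VALUES (1, 'first')")

# Re-inserting the same primary key is silently skipped instead of raising
# an IntegrityError.
conn.execute(
    "INSERT INTO items (id, name) VALUES (1, 'duplicate') "
    "ON CONFLICT DO NOTHING"
)

print(conn.execute("SELECT id, name FROM items").fetchall())  # [(1, 'first')]
```

In PostgreSQL the same statement works verbatim, and a conflict target such as `ON CONFLICT (id) DO NOTHING` can narrow which constraint is tolerated.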

Going back to the documentation at https://docs.litellm.ai/docs/providers/cloudflare_workers:
1. Could we get an example for the case where litellm is used as a proxy?
2. It would help if, in the debug output, we could...
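For context, a hedged sketch of what a litellm-as-proxy setup for Cloudflare Workers AI might look like, based on the linked provider docs (the credential values are placeholders, and the exact model name should be checked against those docs):

```shell
# Assumed: litellm is installed with proxy support (pip install 'litellm[proxy]').
# Placeholder credentials; substitute real Cloudflare values.
export CLOUDFLARE_API_KEY="<your-cloudflare-api-token>"
export CLOUDFLARE_ACCOUNT_ID="<your-account-id>"

# Start the OpenAI-compatible proxy (listens on port 4000 by default).
litellm --model cloudflare/@cf/meta/llama-2-7b-chat-int8
```

Clients would then point their OpenAI base URL at `http://localhost:4000`.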

It does not show up; is it expected to work if I already have the main duplicated plugin installed?

I removed the previous plugin and reinstalled this one from GitHub, but I still can't find it in the Preferences > Plugins UI screen.

Maybe I am missing what I can change and what I should expect. Are you saying that when I select a different name in the dropdown, I should hear somebody...

If the OpenAI connection fails and we restart the app, we keep getting the same message (i.e., "Failed to get response from OpenAI after 5 attempts"). The app is definitely...

OS: Linux VM. I am trying to leverage ollama/litellm OpenAI-compatible endpoints to test data-to-paper further, which means I need to enforce the OpenAI base URL. I tried changing env.py...
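One workaround worth sketching (hedged: this assumes the code path reads the standard OpenAI environment variables rather than hard-coding the endpoint; the port and key are placeholders for a local litellm/ollama OpenAI-compatible server):

```shell
# Newer OpenAI Python SDKs (>= 1.0) honor OPENAI_BASE_URL;
# older ones (< 1.0) read OPENAI_API_BASE instead, so set both.
export OPENAI_BASE_URL="http://localhost:4000/v1"
export OPENAI_API_BASE="http://localhost:4000/v1"
# Many local OpenAI-compatible servers ignore the key's value,
# but the SDK requires one to be set.
export OPENAI_API_KEY="sk-local-placeholder"
```

If the application hard-codes the endpoint instead, these variables will have no effect and the base URL has to be patched in the code (e.g. the env.py attempt mentioned above).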