Perplexica
None of these LLM API providers (Ollama LLMs, Azure OpenAI LLMs, embedding models, or any other LLMs) work at all despite those changes.
@ItzCrazyKns none of these LLM API providers (Ollama LLMs, Azure OpenAI LLMs, embedding models, or any other LLMs) work at all despite those changes, per your docs cited there.
Originally posted by @arunkumarakvr in https://github.com/ItzCrazyKns/Perplexica/issues/10#issuecomment-2105953814
More context:
It seems the settings UI cannot be used to configure any existing or new LLM API, so this wrapper does not work properly.
In summary, this wrapper has never worked for me at all: neither the OpenAI API nor Ollama is usable, nor are the popular OpenAI-compatible APIs. Is this caused by broken settings, or something else?
What error are you facing? How are you configuring it? Please share some logs, screenshots, or other details.
Are you trying to run it through a local OpenAI-compatible API?
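For the configuration question above, one useful first step is to confirm that the backends themselves are reachable from the machine running Perplexica before concluding that the settings UI is broken. The sketch below is a minimal, hypothetical check and is not part of Perplexica; it assumes Node 18+ (built-in `fetch`), Ollama on its default port 11434, and a generic OpenAI-compatible server exposing `/v1/models`. The `OLLAMA_URL`, `OPENAI_BASE_URL`, and `OPENAI_API_KEY` values are placeholders for whatever your own setup uses.

```ts
// probe-endpoints.ts
// Quick reachability check for the two kinds of backends discussed above.
// Assumptions: Ollama's default port (11434) and the standard OpenAI-compatible
// /v1/models route; adjust OLLAMA_URL / OPENAI_BASE_URL / OPENAI_API_KEY to your setup.

const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";
const OPENAI_BASE_URL = process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";
const OPENAI_API_KEY = process.env.OPENAI_API_KEY ?? "";

async function probe(name: string, url: string, headers: Record<string, string> = {}) {
  try {
    const res = await fetch(url, { headers });
    const body = await res.text();
    console.log(`[${name}] ${res.status} ${res.statusText}`);
    console.log(body.slice(0, 300)); // first few hundred characters are enough for a bug report
  } catch (err) {
    console.error(`[${name}] request failed:`, err);
  }
}

async function main() {
  // Ollama lists locally pulled models at /api/tags.
  await probe("ollama", `${OLLAMA_URL}/api/tags`);

  // OpenAI-compatible servers list models at /v1/models and expect a Bearer token.
  // (Azure OpenAI uses a different URL scheme and an api-key header, so it needs its own check.)
  await probe("openai-compatible", `${OPENAI_BASE_URL}/models`, {
    Authorization: `Bearer ${OPENAI_API_KEY}`,
  });
}

main();
```

Output from a probe like this is the kind of log the maintainer is asking for: it distinguishes an unreachable or misconfigured backend from Perplexica failing to apply the settings.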