Self-Hosted App with Node.js Express
Before submitting your bug report
- [ ] I believe this is a bug. I'll try to join the Continue Discord for questions
- [ ] I'm not able to find an open issue that reports the same bug
- [ ] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: macOS 14.2.1
- Continue: 0.8.14
- IDE: 1.81.1
Description
Hi folks,
I just set up a Node.js application with the npm library code-llama-cpp and downloaded the mistral-7b LLM from Hugging Face. The server runs on localhost port 3000 and exposes a /completion endpoint.
The configuration added in Continue is:
```json
{ "title": "Local", "provider": "llama.cpp", "model": "mistral-7b", "apiBase": "http://localhost:3000/" }
```
When I type in the Continue chat, the API call reaches the server with this request body:
```js
{ prompt: '[INST] hot to git clone [/INST]', stream: true, n_predict: 1024 }
```
After processing the prompt, the API returns a JSON response of the form:
```js
{ content: [{ type: "text", text: <my LLM response> }] }
```
But this response content is never printed/embedded in the Continue extension. Please help me with this. What is the ideal API response format expected by the continue.dev extension?
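For reference, the stock llama.cpp server streams its /completion output as server-sent events, one `data: {...}` line per token, each carrying a `content` string and a `stop` flag, which is presumably the shape a provider set to `llama.cpp` tries to parse. A minimal Express sketch of such an endpoint (the `generateTokens` helper is a hypothetical stand-in for the real model call):
```js
const express = require("express");
const app = express();
app.use(express.json());

// Hypothetical stand-in for the real code-llama-cpp model call.
async function* generateTokens(prompt) {
  for (const token of ["git ", "clone ", "<repo-url>"]) yield token;
}

app.post("/completion", async (req, res) => {
  // Stream server-sent events the way llama.cpp's server does:
  // one `data: {...}` line per token, then a final event with stop=true.
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  for await (const token of generateTokens(req.body.prompt)) {
    res.write(`data: ${JSON.stringify({ content: token, stop: false })}\n\n`);
  }
  res.write(`data: ${JSON.stringify({ content: "", stop: true })}\n\n`);
  res.end();
});

app.listen(3000);
```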
To reproduce
No response
Log output
No response
Related issue: https://github.com/continuedev/continue/issues/471#issuecomment-1718674693
Can someone please help here? What is the correct specification that the response must satisfy? The response is not being displayed in the Continue frontend.
@murshidav this helps. A good Samaritan from the Discord server helped me out: https://github.com/sestinj/slow-server/blob/main/main.py
@murshidav You can find the source code that parses the response here. If you can build an OpenAI-compatible server, that may be the better approach, since the OpenAI API is becoming an industry standard. As mentioned in the comment above, that repo shows an example of such a server.
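To make that concrete, here is a minimal sketch of an OpenAI-compatible streaming endpoint in Express, assuming the client consumes the standard chat.completions SSE stream; the token list stands in for a real model call:
```js
const express = require("express");
const app = express();
app.use(express.json());

app.post("/v1/chat/completions", (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  // Stand-in tokens; a real server would run the model here.
  const tokens = ["git ", "clone ", "<repo-url>"];
  for (const token of tokens) {
    // Each chunk follows the OpenAI streaming schema:
    // the text lives in choices[0].delta.content.
    const chunk = {
      id: "chatcmpl-local",
      object: "chat.completion.chunk",
      created: Math.floor(Date.now() / 1000),
      model: req.body.model,
      choices: [{ index: 0, delta: { content: token }, finish_reason: null }],
    };
    res.write(`data: ${JSON.stringify(chunk)}\n\n`);
  }
  // The stream ends with a literal [DONE] sentinel.
  res.write("data: [DONE]\n\n");
  res.end();
});

app.listen(3000);
```
With a server like this, the Continue config entry would presumably switch to the `openai` provider pointed at the same `apiBase`.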