Integrating PiecesOS LLM in Continue
Description
Added support for Pieces OS as a new LLM provider in Continue. This integration allows users to utilize Pieces OS for completions and other supported operations within the Continue environment.
Checklist
- [ ] The base branch of this PR is `dev`, rather than `main`
- [x] The relevant docs have been created and updated
Details of Changes
- Created `piecesOSLLM.ts` in the `core/llm/llms` directory
- Implemented `PiecesOSLLM` class extending `BaseLLM`
- Set up the PiecesOS provider and client
- Implemented key methods: `_convertArgs`, `_streamComplete`, `_call`
- Added getters for LLM type and identifying parameters
- Updated `index.ts` to include PiecesOS
- Updated `config_schema.json` for PiecesOSLLM configuration
- Created `piecesOSLLM.md` documentation
- Added setup instructions, configuration details, usage guidelines, and limitations
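The provider described above can be sketched roughly as follows. This is a minimal, hypothetical illustration: Continue's real `BaseLLM` has a much richer interface than the stand-in defined here, and the PiecesOS endpoint URL and request body shape are assumptions for the sketch, not the actual Pieces API.

```typescript
// Minimal stand-in for Continue's BaseLLM so the sketch is self-contained;
// the real class lives in core/llm and has a larger surface.
interface CompletionOptions {
  model: string;
  temperature?: number;
  maxTokens?: number;
}

abstract class BaseLLM {
  constructor(public options: CompletionOptions) {}
  abstract _streamComplete(
    prompt: string,
    options: CompletionOptions,
  ): AsyncGenerator<string>;
}

class PiecesOSLLM extends BaseLLM {
  static providerName = "pieces_os";
  // Assumed local endpoint; the real PiecesOS port and path may differ.
  apiBase = "http://localhost:1000";

  // Map Continue's completion options onto the request body the
  // (assumed) PiecesOS completions endpoint expects.
  _convertArgs(options: CompletionOptions, prompt: string) {
    return {
      model: options.model,
      prompt,
      temperature: options.temperature ?? 0.7,
      max_tokens: options.maxTokens ?? 1024,
    };
  }

  async *_streamComplete(
    prompt: string,
    options: CompletionOptions,
  ): AsyncGenerator<string> {
    const resp = await fetch(`${this.apiBase}/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(this._convertArgs(options, prompt)),
    });
    // Yield decoded chunks as the response body streams in.
    const reader = resp.body!.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      yield decoder.decode(value);
    }
  }
}
```

A `_call` method (non-streaming completion) would typically collect the generator's chunks into one string.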
Deploy Preview for continuedev failed.
| Name | Link |
|---|---|
| Latest commit | 56a5ad4fe319bbe649f9fcb16fa79c26ea153ea8 |
| Latest deploy log | https://app.netlify.com/sites/continuedev/deploys/66e9c558e72a020008ae5bfa |
@sestinj, could you share your thoughts on this?
Hi @sambhavnoobcoder, sorry for the delay. I'm looking into Pieces OS more and thinking that something even better than this PR might be OpenAI-compatibility, so that a custom provider isn't needed. We're reaching a point where there are so many LLM providers to maintain that many of them are likely to fall out of regular maintenance. The more providers can use OpenAI-compatible servers, the more stable and up-to-date support we can offer for them, and I'd strongly prefer to do this for Pieces.
Is Pieces OS OpenAI-compatible, or is this a change that could be made? I would estimate that this would require fewer lines of code within PiecesOS than it would in this current PR.
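To illustrate the suggestion: if PiecesOS exposed an OpenAI-compatible `/v1/chat/completions` endpoint, Continue's existing OpenAI provider could simply be pointed at it via a custom `apiBase`, with no new provider class. The sketch below builds such a request; the endpoint URL, port, and model name are hypothetical, not actual Pieces values.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the URL and JSON body for an OpenAI-compatible chat completion
// request against an arbitrary apiBase (e.g. a local PiecesOS server).
function buildChatCompletionRequest(
  apiBase: string,
  model: string,
  messages: ChatMessage[],
): { url: string; body: string } {
  return {
    // Strip a trailing slash so the path joins cleanly.
    url: `${apiBase.replace(/\/$/, "")}/chat/completions`,
    body: JSON.stringify({ model, messages, stream: true }),
  };
}
```

The corresponding Continue `config.json` entry would then reuse the existing `openai` provider with a custom `apiBase`, e.g. `{"title": "Pieces OS", "provider": "openai", "model": "pieces-model", "apiBase": "http://localhost:1000/v1"}` (values hypothetical).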
Hi @sambhavnoobcoder, since it's been a while and I think the OpenAI-compatibility approach would be much better, I'm going to close this PR for the time being. Happy to take another look if you have the chance to explore the OpenAI-compatible route.