add llamastack support
Description
Add Llama Stack support to Continue. Most of the functions simply call the OpenAI-compatible endpoints; the exception is the completion endpoint used for Autocomplete, which is customized because OpenAI no longer offers a completions endpoint.
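As a rough illustration of that split (a hypothetical sketch, not the code in this PR; the class name, endpoint paths, and request/response field names below are assumptions rather than Continue's or Llama Stack's actual API):

```typescript
// Hypothetical sketch of the approach described above, not the PR's actual code.
// Endpoint paths and field names are illustrative assumptions.

interface CompletionOptions {
  model: string;
  maxTokens?: number;
}

class LlamaStackProvider {
  constructor(private apiBase: string = "http://localhost:5000") {}

  // Chat (and most other calls) can go straight to the OpenAI-compatible
  // endpoint that Llama Stack exposes.
  async chat(
    messages: { role: string; content: string }[],
    options: CompletionOptions,
  ): Promise<string> {
    const res = await fetch(`${this.apiBase}/v1/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: options.model, messages }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }

  // Autocomplete needs a plain-text completion, so this call is customized
  // instead of relying on an OpenAI /completions route that no longer exists
  // for current models.
  async complete(prompt: string, options: CompletionOptions): Promise<string> {
    const res = await fetch(`${this.apiBase}/completion`, { // assumed path
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: options.model,
        prompt,
        max_tokens: options.maxTokens ?? 256,
      }),
    });
    const data = await res.json();
    return data.content ?? data.choices?.[0]?.text; // response shape is an assumption
  }
}
```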
Checklist
- [ ] I've read the contributing guide
- [ ] The relevant docs, if any, have been updated or created
- [ ] The relevant tests, if any, have been updated or created
Screenshots
Tested and passing 7 tests; the one failing test is due to a model problem.
Deploy request for continuedev pending review.
Visit the deploys page to approve it
| Name | Link |
|---|---|
| Latest commit | b9aaad51934dd114e4d13484869fdca7710fdf37 |
All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.
😱 Found 1 issue. Time to roll up your sleeves! 😱
I have read the CLA Document and I hereby sign the CLA
@sestinj I tried to use my branch to build a new plugin for testing, but somehow I cannot get my new code installed in VS Code. Following the guidance here, I installed the newly built plugin successfully, but the Continue extension just cannot be loaded in VS Code. Can you help me with this? Thanks!
@wukaixingxp Can you share error logs or anything? I want to help but am going to need more information
https://docs.continue.dev/troubleshooting
> @wukaixingxp Can you share error logs or anything? I want to help but am going to need more information
> https://docs.continue.dev/troubleshooting

This is a screenshot from when I tried to debug in the host VS Code, following this guide.
@wukaixingxp It looks like there are type errors (you can see them in the failing tests above). You should be able to reproduce these locally with `npm run build` inside of the `core` folder. Look at other examples of openai-adapters to see how to fix this (I think it is a problem with the constructor). There is a chance that this is also what is stopping the extension from running locally.
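For readers following along, here is a generic, hypothetical sketch of the constructor pattern being referred to; the interface and class names are placeholders, not Continue's actual openai-adapters code:

```typescript
// Hypothetical illustration of the constructor-forwarding pattern; the names
// and types below are placeholders, not the real openai-adapters interfaces.

interface AdapterConfig {
  apiKey?: string;
  apiBase?: string;
}

class BaseOpenAiAdapter {
  constructor(protected config: AdapterConfig) {}
}

class LlamaStackAdapter extends BaseOpenAiAdapter {
  constructor(config: AdapterConfig) {
    // Forwarding the same typed config object to the base class keeps the
    // constructor signatures aligned; a mismatch here is the kind of thing
    // `npm run build` in the core folder would surface as a type error.
    super({ ...config, apiBase: config.apiBase ?? "http://localhost:5000" });
  }
}
```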
@sestinj Thanks for pointing this out. I fixed the bug and got all the CI tests passing. I need your help with two things: (1) Is there any way to preview the model-providers webpage, like this one, with my code to make sure everything looks good? (2) I built the extension from my PR and was able to run some manual tests after installing it, but it feels like the code context is not showing up. Maybe something is wrong with the latest code? Happy to discuss this on Discord if possible.
@wukaixingxp Yup, for (1) you can `cd docs && npm run start` to start a local docs server. It will then live update when you change the markdown.

> I felt like the code context is not showing up.

What do you mean by this?
@sestinj Given that my PR passed all the tests, can we have it merged ASAP? Thanks!
NVM.. I think it may be my local branch problem..
Yup, great work!
🎉 This PR is included in version 1.1.0 🎉
The release is available on:
Your semantic-release bot 📦🚀