Support for Azure ML endpoints
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows
- Continue: v0.8.22
- IDE: VS Code
Description
I am looking into integrating LLM models hosted in Azure ML Studio, but I could not find a provider configuration for Azure ML Studio. I tried adapting the Azure OpenAI configuration, replacing the endpoint with my Azure ML endpoint, but it doesn't work.
To reproduce
1. Open config.json and configure the Azure ML Studio endpoint.
2. Specify the provider as "openai" (my config is roughly as shown below).
3. Try a chat request.
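Roughly what I had in config.json (API key redacted; field names adapted from the Azure OpenAI setup, which may be part of the problem):

```json
{
  "models": [
    {
      "title": "CodeLlama on Azure ML",
      "provider": "openai",
      "model": "codellama-7b",
      "apiType": "azure",
      "engine": "codellama-7b-python-hf-6",
      "apiVersion": "2023-07-01-preview",
      "apiBase": "https://visagan-continue-poc-heati.eastus2.inference.ml.azure.com",
      "apiKey": "<redacted>"
    }
  ]
}
```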
Log output
notificationsAlerts.ts:42 HTTP 424 Failed Dependency from https://visagan-continue-poc-heati.eastus2.inference.ml.azure.com/openai/deployments/codellama-7b-python-hf-6/chat/completions?api-version=2023-07-01-preview {"detail":"Not Found"}
@visagansanthanam-unisys we have a bit of extra documentation here about how to set up with Azure OpenAI Service. Let me know if that doesn't end up being the solution to your problem and I'll take a look right away!
In either case, we can look at how documentation/setup experience might be improved
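For reference, an Azure OpenAI Service model entry in config.json looks roughly like this (resource, deployment, and key are placeholders; double-check the field names against the docs linked above):

```json
{
  "title": "Azure OpenAI",
  "provider": "openai",
  "model": "gpt-4",
  "apiType": "azure",
  "engine": "<your-deployment-name>",
  "apiVersion": "2023-07-01-preview",
  "apiBase": "https://<your-resource>.openai.azure.com",
  "apiKey": "<your-api-key>"
}
```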
@sestinj I am trying to connect self-hosted models like Llama and StarCoder in Azure ML Studio to the Continue plugin. Do we have a provider for open-source models hosted in Azure ML Studio?
@visagansanthanam-unisys I believe that models deployed via Azure ML Studio all have different input formats, which means there is no good way for us to have built-in support, though it is possible to build a CustomLLM using config.ts.
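For what that could look like, here is a rough sketch of a config.ts modification, assuming the general CustomLLM shape from the docs. The /score route, request body, and response parsing are all placeholders: each Azure ML deployment defines its own schema (see the endpoint's "Consume" tab in ML Studio), which is exactly why there's no built-in provider.

```ts
// Sketch only: the Config type comes from Continue's config.ts typings,
// and the exact CustomLLM interface should be checked against the docs.
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: { title: "CodeLlama (Azure ML)", model: "codellama-7b" },
    // No real streaming here: we make one request and yield the whole result.
    streamCompletion: async function* (prompt: string) {
      const resp = await fetch(
        // Placeholder endpoint; use your deployment's scoring URL.
        "https://<your-endpoint>.<region>.inference.ml.azure.com/score",
        {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${process.env.AZURE_ML_KEY}`,
          },
          // Illustrative body; your deployment's input schema may differ.
          body: JSON.stringify({
            input_data: {
              input_string: [prompt],
              parameters: { max_new_tokens: 512, temperature: 0.2 },
            },
          }),
        },
      );
      if (!resp.ok) {
        throw new Error(`Azure ML request failed: HTTP ${resp.status}`);
      }
      const data = await resp.json();
      // Illustrative parsing; adjust to whatever your deployment returns.
      yield data.output ?? JSON.stringify(data);
    },
  });
  return config;
}
```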
Are there constraints that led you to choose Azure ML Studio over other options which might more easily be integrated?