
support for Azure ML endpoints

Open visagansanthanam-unisys opened this issue 10 months ago • 3 comments


Relevant environment info

- OS: Windows
- Continue: v0.8.22
- IDE: VS Code

Description

I am looking into integrating LLM models hosted in Azure ML Studio. However, I could not find a provider configuration for Azure ML Studio. I tried reusing the Azure OpenAI configuration with the Azure ML endpoint substituted in, but it doesn't work.


To reproduce

1. Open config.json and configure an Azure ML Studio endpoint, specifying the provider as "openai".
2. Try a chat request.
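For context, the Azure OpenAI-style configuration being repurposed here looks roughly like the following. Field names follow Continue's Azure OpenAI provider setup as of v0.8.x to the best of my knowledge, so treat this shape as an assumption; all values are placeholders, and as the log below shows, this shape does not match what an Azure ML Studio endpoint expects:

```json
{
  "models": [
    {
      "title": "Azure ML CodeLlama",
      "provider": "openai",
      "model": "codellama-7b",
      "apiType": "azure",
      "apiBase": "https://<your-endpoint>.inference.ml.azure.com",
      "engine": "<your-deployment-name>",
      "apiVersion": "2023-07-01-preview",
      "apiKey": "<your-key>"
    }
  ]
}
```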

Log output

notificationsAlerts.ts:42 HTTP 424 Failed Dependency from https://visagan-continue-poc-heati.eastus2.inference.ml.azure.com/openai/deployments/codellama-7b-python-hf-6/chat/completions?api-version=2023-07-01-preview  {"detail":"Not Found"}

visagansanthanam-unisys avatar Apr 04 '24 09:04 visagansanthanam-unisys

@visagansanthanam-unisys we have a bit of extra documentation here about how to set up with Azure OpenAI Service. Let me know if that doesn't end up being the solution to your problem and I'll take a look right away!

In either case, we can look at how the documentation/setup experience might be improved.

sestinj avatar Apr 05 '24 06:04 sestinj

@sestinj I am trying to connect self-hosted models such as Llama and StarCoder, deployed in Azure ML Studio, to the Continue plugin. Do we have a provider for open-source models hosted in Azure ML Studio?

visagansanthanam-unisys avatar Apr 05 '24 08:04 visagansanthanam-unisys

@visagansanthanam-unisys I believe that models deployed via Azure ML Studio all have different input formats, which means there is no good way for us to offer built-in support, though it is possible to build a CustomLLM using config.ts.
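A minimal sketch of what such a CustomLLM could look like. The `CustomLLM`/`CompletionOptions` shapes below are local stand-ins, not Continue's actual types, and the request payload format is an assumption based on how some Azure ML Llama-family deployments are scored — check the "Consume" tab of your endpoint for the real schema, URL, and key:

```typescript
// Hypothetical minimal interfaces standing in for Continue's config.ts types.
interface CompletionOptions {
  model: string;
  maxTokens?: number;
}

interface CustomLLM {
  options: { title: string; model: string };
  streamCompletion: (
    prompt: string,
    options: CompletionOptions,
  ) => AsyncGenerator<string>;
}

// Build the request body the Azure ML scoring endpoint expects.
// This {"input_data": {...}} shape is an assumption; your deployment's
// schema may differ.
function buildAzureMLPayload(prompt: string, options: CompletionOptions) {
  return {
    input_data: {
      input_string: [prompt],
      parameters: { max_new_tokens: options.maxTokens ?? 512 },
    },
  };
}

const azureMLModel: CustomLLM = {
  options: { title: "Azure ML CodeLlama", model: "codellama-7b" },
  streamCompletion: async function* (prompt, options) {
    // Placeholder endpoint URL and key; substitute your deployment's values.
    const response = await fetch(
      "https://<your-endpoint>.inference.ml.azure.com/score",
      {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer <your-endpoint-key>",
        },
        body: JSON.stringify(buildAzureMLPayload(prompt, options)),
      },
    );
    const data = await response.json();
    // The response field name ("output") is deployment-specific.
    yield data[0]?.output ?? JSON.stringify(data);
  },
};
```

Non-streaming endpoints can be wrapped this way by yielding the full completion once, as above; if your deployment supports server-sent events you would read the response body incrementally instead.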

Are there constraints that led you to choose Azure ML Studio over other options which might more easily be integrated?

sestinj avatar Apr 08 '24 19:04 sestinj