LlamaIndexTS

Cannot use Azure OpenAI embedding model.

Open xuhaodev opened this issue 1 year ago • 5 comments

I cannot use the Azure OpenAI embedding model. Here is my setup:

  const azureOpenaiLLM = new OpenAI({ model: "gpt-4", temperature: 0 });
  const azureOpenaiEmbedding = new OpenAIEmbedding({
    model: "text-embedding-ada-002",
  });
  const serviceContext = serviceContextFromDefaults({
    llm: azureOpenaiLLM,
    embedModel: azureOpenaiEmbedding,
  });

It looks like the embedding model value is not passed through. Stack trace:

  Error: 400 The embeddings operation does not work with the specified model, gpt-35-turbo. Please choose different model and try again.

May I know how to use the Azure OpenAI embedding model?

xuhaodev avatar Apr 02 '24 01:04 xuhaodev

I assume you've set the Azure environment variables (see https://ts.llamaindex.ai/modules/llms/available_llms/azure#environment-variables)?
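A quick way to verify (a sketch; the variable names below are assumed from that docs page, adjust if your setup differs):

  // Sanity check: warn about any missing Azure environment variable
  // (names assumed from the LlamaIndexTS Azure docs linked above).
  for (const name of [
    "AZURE_OPENAI_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
  ]) {
    if (!process.env[name]) {
      console.warn(`Missing environment variable: ${name}`);
    }
  }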

Then I would try calling the embedding model directly first:

  const texts = ["hello", "world"];
  const embeddings = await azureOpenaiEmbedding.getTextEmbeddingsBatch(texts);
  console.log(`\nWe have ${embeddings.length} embeddings`);
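If it helps, a self-contained version of that check (a sketch; the import path is an assumption and may differ between llamaindex versions):

  // Standalone embedding test (sketch): create the embedding model and embed two strings.
  import { OpenAIEmbedding } from "llamaindex";

  const azureOpenaiEmbedding = new OpenAIEmbedding({
    model: "text-embedding-ada-002",
  });

  const texts = ["hello", "world"];
  const embeddings = await azureOpenaiEmbedding.getTextEmbeddingsBatch(texts);
  console.log(`\nWe have ${embeddings.length} embeddings`);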

marcusschiesser avatar Apr 03 '24 02:04 marcusschiesser

@xuhaoruins when creating an instance of the OpenAIEmbedding class, you need to make sure you are using EMBEDDING_MODEL as the deployment target:

  const azure = {
    azureADTokenProvider,
    deployment: process.env.AZURE_OPENAI_DEPLOYMENT ?? "gpt-35-turbo",
  };

  // configure LLM model
  Settings.llm = new OpenAI({
    azure,
  }) as any;

  azure.deployment = process.env.EMBEDDING_MODEL as string; // <----- switch the shared config to the embedding deployment

  Settings.embedModel = new OpenAIEmbedding({
    azure,
    model: process.env.EMBEDDING_MODEL,
    dimensions: process.env.EMBEDDING_DIM
      ? parseInt(process.env.EMBEDDING_DIM)
      : undefined,
  });
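A variant of the above that avoids mutating the shared azure object, keeping one config per model so the two deployment names stay independent (a sketch; the fallback deployment names are placeholders, and azureADTokenProvider / Settings are as in the snippet above):

  // One Azure config per model, so the LLM and embedding deployments never collide (sketch).
  const llmAzure = {
    azureADTokenProvider,
    deployment: process.env.AZURE_OPENAI_DEPLOYMENT ?? "gpt-35-turbo",
  };
  const embeddingAzure = {
    azureADTokenProvider,
    deployment: process.env.EMBEDDING_MODEL ?? "text-embedding-ada-002",
  };

  Settings.llm = new OpenAI({ azure: llmAzure });
  Settings.embedModel = new OpenAIEmbedding({
    azure: embeddingAzure,
    model: process.env.EMBEDDING_MODEL,
  });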

manekinekko avatar Jun 11 '24 13:06 manekinekko

@xuhaoruins does this mean that azure.deployment and model need to have the same value? If it's not possible to use e.g. my-embedding as the deployment name, then we would need to change that.

marcusschiesser avatar Jun 14 '24 09:06 marcusschiesser

Bump, facing the same challenge - it's unclear how one configures the Azure embedding model. I tried a couple of different things, but it seems the LLM endpoints are always used for embeddings, which results in the same failure as above.

  BadRequestError: 400 The embeddings operation does not work with the specified model, gpt-4o-mini. Please choose different model and try again. You can learn more about which models can be used with each operation here: https://go.microsoft.com/fwlink/?linkid=2197993.
      at Function.generate (file:///C:/Users/matei/projects/exchange_chatbot/node_modules/openai/src/error.ts:72:14)
      at AzureOpenAI.makeStatusError (file:///C:/Users/matei/projects/exchange_chatbot/node_modules/openai/src/core.ts:462:21)
      at AzureOpenAI.makeRequest (file:///C:/Users/matei/projects/exchange_chatbot/node_modules/openai/src/core.ts:526:24)
      at processTicksAndRejections (node:internal/process/task_queues:95:5)

mateid avatar May 02 '25 15:05 mateid