
ChatOpenAI: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>

Open sahand68 opened this issue 1 year ago • 3 comments

System Info

langchain==0.0.169 openai==0.27.6

Who can help?

@hwchase17 @agola11 @vowelparrot

Information

  • [ ] The official example notebooks/scripts
  • [ ] My own modified scripts

Related Components

  • [X] LLMs/Chat Models
  • [ ] Embedding Models
  • [ ] Prompts / Prompt Templates / Prompt Selectors
  • [ ] Output Parsers
  • [ ] Document Loaders
  • [ ] Vector Stores / Retrievers
  • [ ] Memory
  • [ ] Agents / Agent Executors
  • [ ] Tools / Toolkits
  • [ ] Chains
  • [ ] Callbacks/Tracing
  • [ ] Async

Reproduction

import os
from dotenv import load_dotenv
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings

load_dotenv('.env')

ChatOpenAI(
    temperature=0,
    max_tokens=500,
    model_name='gpt-3.5-turbo',
    openai_api_base=os.environ['OPENAI_API_BASE'],
).call_as_llm('Hi')

Expected behavior

[nltk_data] Downloading package stopwords to /home/sahand/nltk_data...
[nltk_data]   Package stopwords is already up-to-date!
[nltk_data] Downloading package punkt to /home/sahand/nltk_data...
[nltk_data]   Package punkt is already up-to-date!
Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
Invalid API key.
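
(A quick diagnostic sketch, not from the original report: in openai 0.27.x the 'engine'/'deployment_id' requirement only kicks in when the client believes it is talking to Azure, which can happen silently if the loaded .env sets OPENAI_API_TYPE.)

import os
import openai

# if either of these reports 'azure', the client will insist on an engine/deployment_id
print(os.environ.get("OPENAI_API_TYPE"))
print(openai.api_type)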

sahand68 avatar May 19 '23 20:05 sahand68

I have the same issue when I use the Azure OpenAI Embedding service.

from langchain.embeddings.openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(deployment="text-embedding-ada-002", model="text-embedding-ada-002")
text = "This is a test query."
query_result = embeddings.embed_query(text)
print(query_result)
Traceback (most recent call last):
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/embedding.py", line 55, in <module>
    main()
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/embedding.py", line 44, in main
    query_result = embeddings.embed_query(text)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/langchain/embeddings/openai.py", line 300, in embed_query
    embedding = self._embedding_func(text, engine=self.deployment)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/langchain/embeddings/openai.py", line 266, in _embedding_func
    return embed_with_retry(
           ^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/langchain/embeddings/openai.py", line 64, in embed_with_retry
    return _embed_with_retry(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
           ^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/langchain/embeddings/openai.py", line 62, in _embed_with_retry
    return embeddings.client.create(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tedzhang/Work/AI_Code/ChatWithPDF/.venv/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 83, in __prepare_create_request
    raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>

trend-ted-zhang avatar May 21 '23 01:05 trend-ted-zhang

import openai
from langchain.llms import AzureOpenAI
from langchain.chains.question_answering import load_qa_chain

llm = AzureOpenAI(
    deployment_name="text-davinci-003",
    model_name="text-davinci-003",
    temperature=0,
    openai_api_base=openai.api_base,
    openai_api_key=openai.api_key,
)
chain = load_qa_chain(llm, chain_type="stuff")
lchain_result = chain.run(
    {"input_documents": documents, "question": query, "return_only_outputs": True}
)

Check the docs: https://python.langchain.com/en/latest/modules/models/llms/integrations/azure_openai_example.html
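
(As a hedged aside: since this issue is about the chat model rather than a completion LLM, the Azure chat wrapper from those same docs is the closer analogue. The deployment name, endpoint, and API version below are placeholders.)

import os
from langchain.chat_models import AzureChatOpenAI

chat = AzureChatOpenAI(
    deployment_name="gpt-35-turbo",  # your Azure deployment name (placeholder)
    openai_api_base="https://<your-resource>.openai.azure.com",
    openai_api_version="2023-05-15",
    openai_api_key=os.environ["OPENAI_API_KEY"],
    temperature=0,
)
chat.call_as_llm("Hi")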

darrynv avatar May 23 '23 09:05 darrynv

Hi everyone,

I don't know if this topic has been resolved; in my case I added the "engine" parameter to the constructor call:

chat = ChatOpenAI(temperature=0.0, engine="gpt-35-turbo")
chat

And I got this warning.

WARNING! engine is not default parameter.
                    engine was transferred to model_kwargs.
                    Please confirm that engine is what you intended.

But it worked for me, and the AzureOpenAI service answered correctly.

Regards

rnavarromatesanz avatar Jun 26 '23 17:06 rnavarromatesanz

I have the same issue when I use the Azure OpenAI Embedding service.

from langchain.embeddings.openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(deployment="text-embedding-ada-002", model="text-embedding-ada-002")
text = "This is a test query."
query_result = embeddings.embed_query(text)
print(query_result)

openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>

Anyone able to fix this issue with the OpenAI Embedding service?

Leonor-Fernandes avatar Jul 16 '23 15:07 Leonor-Fernandes

you only need this:

embeddings: OpenAIEmbeddings = OpenAIEmbeddings(
    openai_api_base= f"https://{AZURE_OPENAI_SERVICE}.openai.azure.com",
    openai_api_type='azure',
    deployment='text-embedding-ada-002',
    openai_api_key=AZURE_OPENAI_API_KEY,
    chunk_size=1,
)

query_result = embeddings.embed_query("is this issue solve?")

My deployment name and model name are the same: text-embedding-ada-002

DSgUY avatar Jul 17 '23 04:07 DSgUY

embeddings: OpenAIEmbeddings = OpenAIEmbeddings(
    openai_api_base=f"https://{AZURE_OPENAI_SERVICE}.openai.azure.com",
    openai_api_type='azure',
    deployment='text-embedding-ada-002',
    openai_api_key=AZURE_OPENAI_API_KEY,
    chunk_size=1,
)

This worked for me.

I think the OpenAI SDK changed, because my code was working in the past.

levalencia avatar Jul 17 '23 11:07 levalencia

This worked for me; I added openai_api_type and had to remove openai_api_version:

return OpenAIEmbeddings(
    deployment="deployment-name",
    model="text-embedding-ada-002",
    openai_api_type='azure',
    chunk_size=1,
)

mrbusche avatar Jul 17 '23 17:07 mrbusche

@levalencia or @mrbusche are you having this bug? https://github.com/hwchase17/langchain/issues/7841

DSgUY avatar Jul 17 '23 21:07 DSgUY

openai_api_type='azure',

Yes, adding openai_api_type='azure' fixes the issue.

langchain==0.0.232

levalencia avatar Jul 26 '23 13:07 levalencia

I have the same issue when I use the Azure OpenAI Embedding service.

from langchain.embeddings.openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(deployment="text-embedding-ada-002", model="text-embedding-ada-002")
text = "This is a test query."
query_result = embeddings.embed_query(text)
print(query_result)

openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>

I have exactly the same issue. Did you find any solutions?

itsalwaysamir avatar Aug 17 '23 08:08 itsalwaysamir

Yes, here:

embeddings: OpenAIEmbeddings = OpenAIEmbeddings(
    openai_api_base=f"https://{AZURE_OPENAI_SERVICE}.openai.azure.com",
    openai_api_type='azure',
    deployment='text-embedding-ada-002',
    openai_api_key=AZURE_OPENAI_API_KEY,
    chunk_size=1,
)

This worked for me. I think the OpenAI SDK changed, because my code was working in the past.

levalencia avatar Aug 17 '23 08:08 levalencia

Hello everyone, I am getting the same error when trying to use GPT-4 from OpenAI directly.

ShreyashKumarpandey avatar Aug 26 '23 05:08 ShreyashKumarpandey

I finally solved it. Azure-related environment variables were also being loaded, which made the library enforce Azure-specific requirements.
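
(For anyone in the same situation, a minimal sketch of clearing those variables before anything imports the openai client; the exact variable names your environment sets are an assumption.)

import os

# drop Azure-flavoured settings so the openai client, imported afterwards,
# falls back to the public OpenAI API
for var in ("OPENAI_API_TYPE", "OPENAI_API_BASE", "OPENAI_API_VERSION"):
    os.environ.pop(var, None)

from langchain.chat_models import ChatOpenAI  # noqa: E402

chat = ChatOpenAI(temperature=0, model_name="gpt-4")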

ShreyashKumarpandey avatar Aug 28 '23 14:08 ShreyashKumarpandey

The solutions given are for OpenAIEmbeddings, but I want to solve this error when I use HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2") with openai_api_type='azure'.

can anyone help me? @levalencia

Krithika-Devi avatar Sep 19 '23 04:09 Krithika-Devi

openai_api_type is only needed when using OpenAIEmbeddings; I don't think it's needed when using HuggingFaceEmbeddings. Maybe you are missing another parameter? Can you paste the code?
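
(For reference, a minimal sketch of a HuggingFaceEmbeddings call that takes no OpenAI parameters at all, assuming the sentence-transformers package is installed.)

from langchain.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
query_result = embeddings.embed_query("This is a test query.")
print(len(query_result))  # all-MiniLM-L6-v2 produces 384-dimensional vectors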

levalencia avatar Sep 19 '23 13:09 levalencia

Passing the model deployment name like this worked for me:

chat = ChatOpenAI(temperature=0.0, model_kwargs={"engine": model_name})

ksaurabh21 avatar Oct 27 '23 08:10 ksaurabh21

Hi, @sahand68,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. The issue is related to the ChatOpenAI module, where users are encountering an error message "Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>". There have been discussions and code snippets shared by several users, suggesting adding the 'engine' parameter or the 'openai_api_type' parameter to the function call. Additionally, one user mentioned that Azure related variables were causing the library to raise Azure related requirements. The issue is currently unresolved.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days. Thank you!

dosubot[bot] avatar Feb 09 '24 16:02 dosubot[bot]

Hi,

I haven't faced this issue anymore with newer versions of langchain!

Best, Sahand


sahand68 avatar Feb 09 '24 16:02 sahand68

🤖

Hi @sahand68,

Thank you for confirming that the issue is no longer present in the latest versions of LangChain. We appreciate your feedback. We will proceed to close the issue accordingly.

Best regards, [Your Name]

About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] avatar Feb 09 '24 16:02 dosubot[bot]