
Langchain uses wrong OpenAI endpoint

Open JeffJassky opened this issue 1 year ago • 9 comments

Context:

Using video_chat

Error:

**openai.error.InvalidRequestError: This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?**

Workaround:

In chatbot.py:

  1. Import OpenAIChat instead of OpenAI from langchain.llms.openai on line 4.
  2. Use self.llm = OpenAIChat(...) instead of self.llm = OpenAI(...) on line 70.
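For context on why the swap fixes the error: OpenAI serves chat-tuned models (gpt-3.5-turbo, gpt-4) only from v1/chat/completions, while the old OpenAI completion wrapper calls v1/completions. A purely illustrative sketch of that routing (the prefix list and function are my own simplification, not langchain's or OpenAI's actual logic):

```python
# Illustrative only: why the InvalidRequestError occurs.
# Chat-tuned models must be sent to /v1/chat/completions; the old
# OpenAI (completion) wrapper targets /v1/completions instead.
CHAT_MODEL_PREFIXES = ("gpt-3.5-turbo", "gpt-4")  # assumption, not exhaustive

def endpoint_for(model_name: str) -> str:
    """Pick the OpenAI REST endpoint appropriate for a model name."""
    if model_name.startswith(CHAT_MODEL_PREFIXES):
        return "/v1/chat/completions"
    return "/v1/completions"

print(endpoint_for("gpt-3.5-turbo"))    # /v1/chat/completions
print(endpoint_for("text-davinci-003")) # /v1/completions
```

Passing a chat model's name to a wrapper that hits the completion endpoint is exactly the mismatch the error message describes.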

JeffJassky avatar May 08 '23 21:05 JeffJassky

What's your langchain version?

yinanhe avatar May 09 '23 05:05 yinanhe

langchain 0.0.101 (as defined in video_chat/requirements.txt) openai 0.27.6

JeffJassky avatar May 09 '23 19:05 JeffJassky


Had the same issue; that solution worked for me, thank you!

ShaiShmuel avatar May 10 '23 07:05 ShaiShmuel

The chat class has been renamed to ChatOpenAI (langchain==0.0.228, openai==0.27.8). The following worked for me.

from langchain.chat_models import ChatOpenAI
llm = ChatOpenAI(...)

xuf12 avatar Jul 12 '23 18:07 xuf12

Made my Day 💯

tkreuder avatar Jul 17 '23 10:07 tkreuder

Recent versions of langchain now suggest importing ChatOpenAI from langchain_community.chat_models instead of langchain.chat_models:

UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain_community.chat_models import ChatOpenAI`
  warnings.warn(

So it should be

from langchain_community.chat_models import ChatOpenAI
llm = ChatOpenAI(...)
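Given the import-path churn across versions documented in this thread, a defensive resolver can be sketched. This is purely illustrative: the candidate module paths come from the comments above, the helper name is made up, and no langchain package needs to be installed for the fallback logic itself.

```python
import importlib

def resolve_chat_openai():
    """Return the first available chat-model class across langchain versions,
    or None if none of the known import paths resolve (hypothetical helper)."""
    candidates = [
        "langchain_community.chat_models",  # newer langchain (this comment)
        "langchain.chat_models",            # langchain ~0.0.228
        "langchain.llms.openai",            # very old: exposes OpenAIChat
    ]
    for mod_name in candidates:
        try:
            mod = importlib.import_module(mod_name)
        except ImportError:
            continue
        cls = getattr(mod, "ChatOpenAI", None) or getattr(mod, "OpenAIChat", None)
        if cls is not None:
            return cls
    return None
```

Pinning langchain in requirements.txt is the simpler fix; a resolver like this only helps code that must run against several versions.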

XieJiSS avatar Jan 03 '24 03:01 XieJiSS

@xuf12 can you please suggest the code to use the gpt-4 model with langchain? I'm using it like this:

from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain import LLMChain
from langchain.prompts.prompt import PromptTemplate
import os

os.environ["OPENAI_API_KEY"] = '******'
llm = ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo')

thanks

Stoik-Reddy avatar Jan 05 '24 11:01 Stoik-Reddy

@Stoik-Reddy Changing your model_name to "gpt-4" is fine.

yinanhe avatar Jan 24 '24 06:01 yinanhe


I made the modifications below.

Initially:

from langchain import OpenAI
llm = OpenAI(temperature=0.9, max_tokens=500, model_name="gpt-4")

Modification:

from langchain.chat_models import ChatOpenAI
llm = ChatOpenAI(temperature=0.9, max_tokens=500, model_name="gpt-4")

makaveli006 avatar Jul 21 '24 14:07 makaveli006