langchain
Documentation not up to date
The current documentation at https://langchain.readthedocs.io/en/latest/modules/agents/getting_started.html does not seem to be up to date with version 0.0.117:
UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain.chat_models import ChatOpenAI`
EDIT: To be more precise, this only happens if I try to change model_name to "gpt-3.5-turbo".
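For reference, a minimal sketch of that getting-started example adapted to the chat-model wrapper the warning points to (only the llm-math tool is loaded here, so no SerpAPI key is needed; an OpenAI API key is still assumed to be set):
from langchain.agents import initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI

# Use the chat-model wrapper instead of OpenAI(model_name="gpt-3.5-turbo"),
# which is what triggers the UserWarning.
llm = ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo")
tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("What is 25 raised to the 0.5 power?")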
I have the same warning.
UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain.chat_models import ChatOpenAI`
Here is my code
bot = VectorDBQA.from_llm(
    k=4,
    llm=OpenAI(model_name="gpt-3.5-turbo", temperature=0),
    vectorstore=db,
    return_source_documents=True,
)
Try using
from langchain.chat_models import ChatOpenAI
llm = ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo')
It also throws that warning for LLMMathChain, where I had to pass OpenAI() as the llm parameter.
Edit: Never mind. Instead of using:
from langchain.llms import OpenAI
It should be:
from langchain import OpenAI
I did the same thing:
from langchain import OpenAI
# from langchain.chat_models import ChatOpenAI
llm = OpenAI(model_name="gpt-4", temperature=0.9)
# llm = ChatOpenAI(model_name="gpt-4", temperature=0.9)
I still get the same warnings:
/Users/nick/Library/Caches/pypoetry/virtualenvs/chatmfl-wJULhpjL-py3.10/lib/python3.10/site-packages/langchain/llms/openai.py:169: UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain.chat_models import ChatOpenAI`
  warnings.warn(
/Users/nick/Library/Caches/pypoetry/virtualenvs/chatmfl-wJULhpjL-py3.10/lib/python3.10/site-packages/langchain/llms/openai.py:608: UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain.chat_models import ChatOpenAI`
  warnings.warn(
llm = ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo')
As a novice, I wonder how to pass a vectorstore in when using this.
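One way to do it, as a sketch: keep the VectorDBQA.from_llm call from the earlier snippet and swap in the ChatOpenAI instance (db is assumed to be an already-built vectorstore such as Chroma or FAISS):
from langchain.chains import VectorDBQA
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo")
bot = VectorDBQA.from_llm(
    llm=llm,
    vectorstore=db,  # db: your existing vectorstore, as in the snippet above
    k=4,
    return_source_documents=True,
)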
Try using
from langchain.chat_models import ChatOpenAI
llm = ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo')
How does memory work with the ChatOpenAI constructor? With the LLMChain, it was a constructor parameter:
chatgpt_chain = LLMChain(
    llm=OpenAI(temperature=0, model_name="gpt-3.5-turbo", streaming=False, callback_manager=callback_manager),
    prompt=prompt,
    verbose=True,
    memory=memory,
    callback_manager=callback_manager,
)
EDIT: Just use ChatOpenAI in lieu of OpenAI in the LLMChain constructor.
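For example, a minimal sketch (prompt and memory are assumed to be defined as before; the memory still goes on the chain, not on the model):
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI

chatgpt_chain = LLMChain(
    llm=ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo"),  # chat wrapper in place of OpenAI
    prompt=prompt,
    verbose=True,
    memory=memory,
)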
Why is the chat model warning set to go off for gpt-4? Isn't GPT-4 a completion model?
I was under the impression that GPT-4 was a text completion model.
As this issue is still open: gpt-3.5-turbo is a chat model, not a completion model, and the same goes for gpt-4. The OpenAI LLM wrapper is for completion models, not chat models, which is why you get this warning. The most advanced completion model currently available is text-davinci-003:
llm = OpenAI(model_name="text-davinci-003")
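Put differently, a rough sketch of the intended pairing (the wrapper follows the model type):
from langchain.chat_models import ChatOpenAI
from langchain.llms import OpenAI

chat_llm = ChatOpenAI(model_name="gpt-3.5-turbo")       # chat models: gpt-3.5-turbo, gpt-4
completion_llm = OpenAI(model_name="text-davinci-003")  # completion models: text-davinci-003, etc.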
The current documentation https://langchain.readthedocs.io/en/latest/modules/agents/getting_started.html seems to not be up to date with version 0.0.117:
UserWarning: You are trying to use a chat model. This way of initializing it is no longer supported. Instead, please use: `from langchain.chat_models import ChatOpenAI`
EDIT: To be more precise, this only happens if I try to change model_name to "gpt-3.5-turbo".
I used that, and I still have the same issue; it's not resolved.
I'm currently getting this issue, and the Python script was working correctly literally two days ago. My imports look like this:
from gpt_index import SimpleDirectoryReader, GPTListIndex, GPTSimpleVectorIndex,LLMPredictor, PromptHelper
from langchain.chat_models import ChatOpenAI
import gradio as gr
import sys, time
import os
The current documentation doesn't seem to show a new way of importing.
Hi, @tomsib2001,
I'm helping the LangChain team manage their backlog and am marking this issue as stale. It seems that the issue you reported is related to outdated documentation triggering a UserWarning when changing the model_name to "gpt-3.5-turbo". There have been discussions and suggestions from other users on potential workarounds, but the current status of the issue is unresolved.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, kindly let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you for your understanding and contribution to LangChain!