[BUG]'gpt-3.5-turbo' does not in assertion list
When I save llm=OpenAI(temperature=0, model_name="gpt-3.5-turbo"), the JSON data looks like this:
"llm": {
  "model_name": "gpt-3.5-turbo",
  "temperature": 0,
  "_type": "openai-chat"
},
but the _type is not in the type assertion list, and the following error is raised:
File ~/miniconda3/envs/gpt/lib/python3.10/site-packages/langchain/llms/loading.py:19, in load_llm_from_config(config)
16 config_type = config.pop("_type")
18 if config_type not in type_to_cls_dict:
---> 19 raise ValueError(f"Loading {config_type} LLM not supported")
21 llm_cls = type_to_cls_dict[config_type]
22 return llm_cls(**config)
ValueError: Loading openai-chat LLM not supported
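The traceback above boils down to a registry-dispatch pattern: the loader pops the serialized `_type` tag, looks it up in a dict mapping type names to classes, and instantiates the matching class with the remaining config. A self-contained toy sketch of that mechanism (the class and registry here are illustrative stand-ins, not the real langchain objects):

```python
# Toy stand-in for an LLM class; not the actual langchain OpenAI class.
class FakeOpenAI:
    def __init__(self, model_name, temperature=0):
        self.model_name = model_name
        self.temperature = temperature

# Registry mapping the serialized "_type" tag to a class.
# "openai-chat" is deliberately absent, mirroring the bug report.
type_to_cls_dict = {"openai": FakeOpenAI}

def load_llm_from_config(config):
    config = dict(config)  # avoid mutating the caller's dict
    config_type = config.pop("_type")
    if config_type not in type_to_cls_dict:
        raise ValueError(f"Loading {config_type} LLM not supported")
    llm_cls = type_to_cls_dict[config_type]
    return llm_cls(**config)

# "openai" is registered, so this loads fine:
llm = load_llm_from_config(
    {"model_name": "gpt-3.5-turbo", "temperature": 0, "_type": "openai"}
)

# "openai-chat" is not in the registry, which reproduces the error above:
try:
    load_llm_from_config({"model_name": "gpt-3.5-turbo", "_type": "openai-chat"})
except ValueError as e:
    print(e)  # Loading openai-chat LLM not supported
```

This is why the error is purely a lookup failure: the saved JSON is fine, but no class is registered under the "openai-chat" key.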
langchain does not support loading an LLM of type openai-chat, which is the default type if you use the gpt-3.5 or gpt-4 model family with langchain.llms.OpenAI. This is intentional, as the devs are deprecating that type in favour of langchain.chat_models.ChatOpenAI, as shown in #1715. You are encouraged to use the ChatOpenAI class if you are using the gpt-3.5 or gpt-4 model family, like so:
from langchain.chat_models import ChatOpenAI
chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
Unfortunately, I am not aware of any functions that allow you to save a chat model.
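As a stopgap, one can persist the relevant parameters by hand with plain JSON instead of relying on a built-in save. A minimal sketch, where the chat_params dict is a hypothetical stand-in for whatever fields your chat model actually exposes:

```python
import json

# Hypothetical parameters to persist for a chat model;
# adjust to match the fields your model actually uses.
chat_params = {
    "model_name": "gpt-3.5-turbo",
    "temperature": 0,
}

# Save the parameters to disk...
with open("chat_llm.json", "w") as f:
    json.dump(chat_params, f, indent=2)

# ...and later read them back to reconstruct the model yourself,
# e.g. chat = ChatOpenAI(**restored)
with open("chat_llm.json") as f:
    restored = json.load(f)

print(restored["model_name"])  # gpt-3.5-turbo
```

This sidesteps the loader's type registry entirely, at the cost of having to reconstruct the model class yourself.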
We get the same error when using the ChatOpenAI class to create the LLM (LangChain version v0.0.196):
chat = ChatOpenAI(temperature=0, openai_api_key=openai_api_key)
chain = LLMChain(llm=chat, prompt=chat_prompt)
On save, which raises no error, the JSON that is exported has the following for the llm portion:
"llm": {
  "model_name": "gpt-3.5-turbo",
  "model": "gpt-3.5-turbo",
  "request_timeout": null,
  "max_tokens": null,
  "stream": false,
  "n": 1,
  "temperature": 0.0,
  "_type": "openai-chat"
},
in which _type is still set to "openai-chat". On load, it raises the error:
ValueError: Loading openai-chat LLM not supported
This was not created using the soon-to-be-deprecated construct, though. Is there a bug in the _type assigned to chat objects created with the new ChatOpenAI construct?
On a related note, save_agent does not work either: it raises an error when trying to save agents, with no specific message except:
ValueError:
Related: https://github.com/hwchase17/langchain/pull/1715
Hi, @hezhefly! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you opened an issue titled "[BUG]'gpt-3.5-turbo' does not in assertion list" about an error that occurs when trying to save a model with the name "gpt-3.5-turbo". The error is raised because the "_type" attribute is not in the type assertion list. outday29 suggested using the ChatOpenAI class instead of langchain.llms.OpenAI to avoid the error. aiscience01 also reported a related error when using ChatOpenAI and save_agent. shivamMg provided a link to a related pull request.
If this issue is still relevant to the latest version of the LangChain repository, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your contribution to the LangChain repository!
@shivamMg @aiscience01 Was there a workaround for this problem? I am trying to load RetrievalQA but it's still not working :(
Sorry, it's been a while since I worked on this.
https://github.com/langchain-ai/langchain/pull/8164 - this snippet didn't work?
if not, then a hacky solution is to monkeypatch the lib: https://github.com/langchain-ai/langchain/pull/1715/files
something like:
from langchain.llms import OpenAIChat, type_to_cls_dict

type_to_cls_dict["openai-chat"] = OpenAIChat
# similarly for other chat models like AzureOpenAIChat etc.
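This kind of monkeypatch works because importing a dict from a module binds the same object the loading code reads, so mutating it updates the registry everywhere. A self-contained illustration of the mechanism, using toy names rather than the real langchain modules:

```python
# Stand-in for a module-level registry dict (like type_to_cls_dict).
registry = {"openai": "OpenAI class"}

def lookup(type_name):
    # The loading code reads the module-level registry at call time.
    if type_name not in registry:
        raise ValueError(f"Loading {type_name} LLM not supported")
    return registry[type_name]

# Client code that "imported" the dict holds a reference to the SAME
# object, so inserting a key through that reference is visible to
# lookup() as well -- no reassignment of the module attribute needed.
imported_registry = registry
imported_registry["openai-chat"] = "OpenAIChat class"

print(lookup("openai-chat"))  # OpenAIChat class
```

Note this only works because the dict object itself is mutated; rebinding imported_registry to a new dict would not affect the module-level registry.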
Thanks for the comment. The problem comes from using mlflow when loading the model. The hacky solution works for now, but it is not ideal for deployment purposes.
I'm still seeing this issue when working with mlflow.