
[BUG] 'gpt-3.5-turbo' is not in assertion list

Open hezhefly opened this issue 1 year ago • 1 comment

When I save llm=OpenAI(temperature=0, model_name="gpt-3.5-turbo"), the JSON data looks like this:

"llm": {
        "model_name": "gpt-3.5-turbo",
        "temperature": 0,
        "_type": "openai-chat"
    },

but the _type is not in the type assertion list, and this raises an error:

File ~/miniconda3/envs/gpt/lib/python3.10/site-packages/langchain/llms/loading.py:19, in load_llm_from_config(config)
     16 config_type = config.pop("_type")
     18 if config_type not in type_to_cls_dict:
---> 19     raise ValueError(f"Loading {config_type} LLM not supported")
     21 llm_cls = type_to_cls_dict[config_type]
     22 return llm_cls(**config)

ValueError: Loading openai-chat LLM not supported
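The failing dispatch can be sketched in plain Python (stub classes here, not the real langchain code): load_llm_from_config pops _type from the config and looks it up in a registry dict, so any _type missing from that dict, such as "openai-chat", raises the ValueError above.

```python
# Standalone sketch of the registry dispatch in langchain/llms/loading.py.
# OpenAI here is a stub, not the real class.

class OpenAI:
    def __init__(self, **kwargs):
        self.config = kwargs

# "openai-chat" is deliberately absent, mirroring the reported behaviour.
type_to_cls_dict = {"openai": OpenAI}

def load_llm_from_config(config):
    config = dict(config)  # avoid mutating the caller's dict
    config_type = config.pop("_type")
    if config_type not in type_to_cls_dict:
        raise ValueError(f"Loading {config_type} LLM not supported")
    llm_cls = type_to_cls_dict[config_type]
    return llm_cls(**config)

saved = {"model_name": "gpt-3.5-turbo", "temperature": 0, "_type": "openai-chat"}
try:
    load_llm_from_config(saved)
except ValueError as e:
    print(e)  # Loading openai-chat LLM not supported
```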

hezhefly avatar Apr 09 '23 16:04 hezhefly

langchain does not support loading LLMs of type openai-chat, which is the default type if you use the gpt-3.5 or gpt-4 model family with langchain.llms.OpenAI. This is intentional, as the devs are deprecating that type in favour of langchain.chat_models.ChatOpenAI, as shown in #1715. You are encouraged to use the ChatOpenAI class if you are using the gpt-3.5 or gpt-4 model family, like so:

from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

Unfortunately, I am not aware of any functions that allow you to save a chat model.
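One hedged workaround, assuming nothing about langchain's internals: serialize the constructor kwargs yourself and rebuild the object on load. ChatModel below is a hypothetical stand-in for langchain.chat_models.ChatOpenAI; the pattern, not the class, is the point.

```python
# Manual persistence sketch for a chat model: dump the constructor kwargs
# to JSON and reconstruct from them. ChatModel is a stub stand-in.
import json
import os
import tempfile

class ChatModel:
    def __init__(self, model_name="gpt-3.5-turbo", temperature=0.0):
        self.model_name = model_name
        self.temperature = temperature

    def to_config(self):
        return {"model_name": self.model_name, "temperature": self.temperature}

def save_chat_model(model, path):
    with open(path, "w") as f:
        json.dump(model.to_config(), f)

def load_chat_model(path):
    with open(path) as f:
        return ChatModel(**json.load(f))

path = os.path.join(tempfile.gettempdir(), "chat_model.json")
chat = ChatModel(temperature=0.0)
save_chat_model(chat, path)
restored = load_chat_model(path)
print(restored.model_name)  # gpt-3.5-turbo
```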

outday29 avatar Apr 16 '23 08:04 outday29

We get the same error when using the ChatOpenAI class to create the LLM (LangChain v0.0.196):

chat = ChatOpenAI(temperature=0, openai_api_key=openai_api_key)
chain = LLMChain(llm=chat, prompt=chat_prompt)

On save, which raises no error, the JSON that is exported has the following for the llm portion:

"llm": { "model_name": "gpt-3.5-turbo", "model": "gpt-3.5-turbo", "request_timeout": null, "max_tokens": null, "stream": false, "n": 1, "temperature": 0.0, "_type": "openai-chat" },

in which _type is still set to "openai-chat". On load, it raises the error:

ValueError: Loading openai-chat LLM not supported

This is not created using the soon-to-be-deprecated construct, though. Is there a bug in the type assignment to the Chat objects created using the new ChatOpenAI construct?

aiscience01 avatar Jun 11 '23 12:06 aiscience01

On a related note, save_agent does not work either. It raises an error when trying to save agents via save_agent(). In this case, there is no specific message, just:

ValueError:

aiscience01 avatar Jun 12 '23 12:06 aiscience01

Related: https://github.com/hwchase17/langchain/pull/1715

shivamMg avatar Jul 18 '23 05:07 shivamMg

Hi, @hezhefly! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you opened an issue titled "[BUG]'gpt-3.5-turbo' does not in assertion list" which is about an error that occurs when trying to save a model with the name "gpt-3.5-turbo". The error is raised because the "_type" attribute is not in the type assertion list. outday29 suggested using the ChatOpenAI class instead of langchain.llms.OpenAI to avoid the error. aiscience01 also reported a related error when using ChatOpenAI and save_agent. shivamMg provided a link to a related pull request.

If this issue is still relevant to the latest version of the LangChain repository, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your contribution to the LangChain repository!

dosubot[bot] avatar Oct 17 '23 16:10 dosubot[bot]

@shivamMg @aiscience01 Was there a workaround for this problem? I am trying to load RetrievalQA but it is still not working :(

kyutcho avatar Oct 24 '23 18:10 kyutcho

sorry, been a while since I worked on this.

https://github.com/langchain-ai/langchain/pull/8164 - this snippet didn't work?

if not, then a hacky solution is to monkeypatch the lib: https://github.com/langchain-ai/langchain/pull/1715/files

something like:

from langchain.llms import OpenAIChat, type_to_cls_dict

type_to_cls_dict["openai-chat"] = OpenAIChat
# similarly for other chat models like AzureOpenAIChat etc.
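In spirit, that monkeypatch just registers the missing key in the loader's registry so the lookup succeeds. A standalone illustration with stub classes (not langchain's real internals):

```python
# Registry-patch illustration: add the missing "openai-chat" entry so a
# config saved with that _type can be dispatched. OpenAIChat is a stub.

class OpenAIChat:
    def __init__(self, **kwargs):
        self.config = kwargs

# The registry starts without "openai-chat", as in the reported error.
type_to_cls_dict = {"openai": object}

# The patch: register the missing type.
type_to_cls_dict["openai-chat"] = OpenAIChat

config = {"model_name": "gpt-3.5-turbo", "temperature": 0, "_type": "openai-chat"}
cls = type_to_cls_dict[config.pop("_type")]
llm = cls(**config)
print(type(llm).__name__)  # OpenAIChat
```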

shivamMg avatar Oct 25 '23 10:10 shivamMg

Thanks for the comment. The problem comes from using mlflow when loading the model. Hacky solution works for now but not ideal for deployment purposes

kyutcho avatar Oct 25 '23 19:10 kyutcho

I'm still seeing this when working with mlflow

mabreuortega avatar Dec 14 '23 19:12 mabreuortega