ERROR-Import Langchain : TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
In Databricks, I am unable to import langchain; it fails with:
TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
Thanks in advance.
same for me
In Databricks, the error disappears with langchain==0.0.125.
It works with version 0.0.125:
pip install langchain==0.0.125
Thanks @samlopezruiz for sharing this solution.
I'll leave this issue open: it works with the older version, but the issue persists with the new version.
I have the same issue and pip install langchain==0.0.125 did not resolve it
TypeError                                 Traceback (most recent call last)
Cell In[1], line 1
----> 1 from langchain.text_splitter import CharacterTextSplitter

File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/__init__.py:5
      1 """Main entrypoint into package."""
      3 from typing import Optional
----> 5 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
      6 from langchain.cache import BaseCache
      7 from langchain.callbacks import (
      8     set_default_callback_manager,
      9     set_handler,
     10     set_tracing_callback_manager,
     11 )

File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/agents/__init__.py:2
      1 """Interface for agents."""
----> 2 from langchain.agents.agent import Agent, AgentExecutor
      3 from langchain.agents.agent_toolkits import (
      4     create_csv_agent,
      5     create_json_agent,
   (...)
     10     create_vectorstore_router_agent,
     11 )
     12 from langchain.agents.conversational.base import ConversationalAgent

File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/agents/agent.py:11
      8 from typing import Any, Dict, List, Optional, Sequence, Tuple, Union
     10 import yaml
---> 11 from pydantic import BaseModel, root_validator
     13 from langchain.agents.tools import InvalidTool
     14 from langchain.callbacks.base import BaseCallbackManager

File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/__init__.py:2, in init pydantic.__init__()

File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/dataclasses.py:48, in init pydantic.dataclasses()

File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/main.py:120, in init pydantic.main()

TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
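For context on why this traceback ends where it does: pydantic 1.10.x calls typing-extensions' dataclass_transform() with a field_specifiers keyword that only exists in typing-extensions>=4.2.0 (or in the stdlib typing module on Python 3.11+). A minimal sketch to check whether the installed copy accepts that keyword:

```python
import inspect

try:
    # Python 3.11+ ships dataclass_transform in the stdlib typing module
    from typing import dataclass_transform
except ImportError:
    # Older interpreters rely on the typing_extensions backport
    from typing_extensions import dataclass_transform

# On typing-extensions<4.2.0 the field_specifiers parameter does not exist,
# so pydantic's call raises the TypeError reported in this thread.
params = inspect.signature(dataclass_transform).parameters
has_field_specifiers = "field_specifiers" in params
print("field_specifiers supported:", has_field_specifiers)
```

If this prints False, the environment has the old backport and the import failure above is expected.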
I am also facing the same issue. I tried pip install langchain and pip install -q langchain; neither works, and I still get TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'. With pip install langchain==0.0.125 I also get the error.
Hi @Abonia1 @zinniz @jigsawmetric Would you mind specifying which Databricks Runtime version you are using? I tested with DBR 12.2 LTS and DBR 13.0 ML with langchain==0.0.125 and 0.0.173, and all work well. I'm happy to take a closer look once you share more info.
This is probably not a langchain issue; it's an incompatibility between typing-extensions and pydantic. Could you try installing langchain 0.0.173, then installing typing-extensions>=4.20?
@WeichenXu123 Do you mean typing-extensions>=4.2.0?
%pip install langchain typing_extensions==4.1.0
...
pydantic 1.10.6 requires typing-extensions>=4.2.0, but you have typing-extensions 4.1.0 which is incompatible.
We are able to reproduce the issue with typing_extensions==4.1.0.
Sorry, yes: we should install typing-extensions>=4.2.0 to address the issue.
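Before importing langchain, one can verify that the environment already satisfies pydantic's minimum. A small sketch (assumes the installed version string is plain numeric, e.g. "4.1.0"; pre-release suffixes would need extra handling):

```python
from importlib import metadata

def typing_extensions_ok(minimum=(4, 2, 0)):
    """Return True if the installed typing-extensions meets pydantic's minimum."""
    try:
        version = metadata.version("typing_extensions")
    except metadata.PackageNotFoundError:
        return False  # not installed at all
    # Pad short versions so "4.2" compares as (4, 2, 0)
    parts = tuple(int(p) for p in version.split(".")[:3])
    parts += (0,) * (3 - len(parts))
    return parts >= minimum

print("typing-extensions OK:", typing_extensions_ok())
```

If this returns False, upgrading with %pip install "typing-extensions>=4.2.0" should clear the TypeError.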
As @WeichenXu123 and @liangz1 mentioned, we believe the root cause is typing-extensions<4.2 (see https://github.com/pydantic/pydantic/issues/4885). We can reproduce the exact same error with the following code on DBR 12.2 LTS:
%pip install langchain
%pip install typing_extensions==4.1.0
import langchain
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
File <command-3996646595503935>:1
----> 1 import langchain
File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
166 thread_local._nest_level += 1
168 try:
169 # Import the desired module. If you’re seeing this while debugging a failed import,
170 # look at preceding stack frames for relevant error information.
--> 171 original_result = python_builtin_import(name, globals, locals, fromlist, level)
173 is_root_import = thread_local._nest_level == 1
174 # `level` represents the number of leading dots in a relative import statement.
175 # If it's zero, then this is an absolute import.
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/langchain/__init__.py:6
3 from importlib import metadata
4 from typing import Optional
----> 6 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
7 from langchain.cache import BaseCache
8 from langchain.chains import (
9 ConversationChain,
10 LLMBashChain,
(...)
18 VectorDBQAWithSourcesChain,
19 )
File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
166 thread_local._nest_level += 1
168 try:
169 # Import the desired module. If you’re seeing this while debugging a failed import,
170 # look at preceding stack frames for relevant error information.
--> 171 original_result = python_builtin_import(name, globals, locals, fromlist, level)
173 is_root_import = thread_local._nest_level == 1
174 # `level` represents the number of leading dots in a relative import statement.
175 # If it's zero, then this is an absolute import.
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/langchain/agents/__init__.py:2
1 """Interface for agents."""
----> 2 from langchain.agents.agent import (
3 Agent,
4 AgentExecutor,
5 AgentOutputParser,
6 BaseMultiActionAgent,
7 BaseSingleActionAgent,
8 LLMSingleActionAgent,
9 )
10 from langchain.agents.agent_toolkits import (
11 create_csv_agent,
12 create_json_agent,
(...)
20 create_vectorstore_router_agent,
21 )
22 from langchain.agents.agent_types import AgentType
File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
166 thread_local._nest_level += 1
168 try:
169 # Import the desired module. If you’re seeing this while debugging a failed import,
170 # look at preceding stack frames for relevant error information.
--> 171 original_result = python_builtin_import(name, globals, locals, fromlist, level)
173 is_root_import = thread_local._nest_level == 1
174 # `level` represents the number of leading dots in a relative import statement.
175 # If it's zero, then this is an absolute import.
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/langchain/agents/agent.py:13
10 from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
12 import yaml
---> 13 from pydantic import BaseModel, root_validator
15 from langchain.agents.agent_types import AgentType
16 from langchain.agents.tools import InvalidTool
File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
166 thread_local._nest_level += 1
168 try:
169 # Import the desired module. If you’re seeing this while debugging a failed import,
170 # look at preceding stack frames for relevant error information.
--> 171 original_result = python_builtin_import(name, globals, locals, fromlist, level)
173 is_root_import = thread_local._nest_level == 1
174 # `level` represents the number of leading dots in a relative import statement.
175 # If it's zero, then this is an absolute import.
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/pydantic/__init__.py:2, in init pydantic.__init__()
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/pydantic/dataclasses.py:48, in init pydantic.dataclasses()
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-e1378973-d308-48c5-9b29-991a3ab4117c/lib/python3.9/site-packages/pydantic/main.py:120, in init pydantic.main()
TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
The remaining question is why and how typing-extensions<4.2 got installed.
This can happen when:
- pip install "typing-extensions<4.2" is executed.
- pip install <package>, where <package> requires typing-extensions<4.2, is executed.
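To track down which installed package may have pinned the old version, one can scan every distribution's declared requirements. A sketch using the stdlib importlib.metadata:

```python
from importlib import metadata

# Collect every installed distribution that declares a dependency on
# typing-extensions; one of these may have pulled in the <4.2 pin.
culprits = {}
for dist in metadata.distributions():
    for req in dist.requires or []:
        if req.lower().replace("_", "-").startswith("typing-extensions"):
            culprits[dist.metadata["Name"]] = req

for name, req in sorted(culprits.items()):
    print(f"{name}: {req}")
```

Any entry whose requirement caps typing-extensions below 4.2 is a candidate for the conflict.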
I found typing-extensions==4.1.1 is preinstalled in DBR 12.2 LTS. This may be related.
Shall we upgrade typing-extensions to >=4.2 in the Databricks runtime?
@WeichenXu123 We don't need to.
@Abonia1 How did you install langchain? Did you run %pip install langchain in your notebook?
@harupy @WeichenXu123 Finally I can install and import a recent version of langchain in Databricks with the DBR 12.2 LTS runtime.
Just do:
%pip install langchain
Thanks for all the support.
I still had problems using typing-extensions>=4.2.0; however, pinning to typing-extensions==4.5.0 worked.
I'm also getting this error and have tried the solutions above to no avail, using Python 3.9 and langchain 0.0.189.
Error for context:
TypeError Traceback (most recent call last)
Input In [28], in <cell line: 2>()
1 import openai
----> 2 from langchain.chat_models import ChatOpenAI
3 from langchain import LLMChain
4 from langchain.prompts.chat import (
5 ChatPromptTemplate,
6 SystemMessagePromptTemplate,
7 HumanMessagePromptTemplate,
8 )
File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/__init__.py:6, in <module>
3 from importlib import metadata
4 from typing import Optional
----> 6 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
7 from langchain.cache import BaseCache
8 from langchain.chains import (
9 ConversationChain,
10 LLMBashChain,
(...)
18 VectorDBQAWithSourcesChain,
19 )
File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/agents/__init__.py:2, in <module>
1 """Interface for agents."""
----> 2 from langchain.agents.agent import (
3 Agent,
4 AgentExecutor,
5 AgentOutputParser,
6 BaseMultiActionAgent,
7 BaseSingleActionAgent,
8 LLMSingleActionAgent,
9 )
10 from langchain.agents.agent_toolkits import (
11 create_csv_agent,
12 create_json_agent,
(...)
21 create_vectorstore_router_agent,
22 )
23 from langchain.agents.agent_types import AgentType
File ~/opt/anaconda3/lib/python3.9/site-packages/langchain/agents/agent.py:13, in <module>
10 from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
12 import yaml
---> 13 from pydantic import BaseModel, root_validator
15 from langchain.agents.agent_types import AgentType
16 from langchain.agents.tools import InvalidTool
File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/__init__.py:2, in init pydantic.__init__()
File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/dataclasses.py:48, in init pydantic.dataclasses()
File ~/opt/anaconda3/lib/python3.9/site-packages/pydantic/main.py:120, in init pydantic.main()
TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
langchain==0.0.219 with typing-extensions==4.2.0 works for me.
When I pin typing-extensions==4.2.0, I get an error that chromadb requires >4.7. So currently I get the same error with langchain==0.0.219, chromadb==0.3.26, typing-extensions==4.7.1 in a miniconda Python 3.8 environment.
Update: I created a new environment with Python 3.9.12, then installed langchain (pip install langchain) and chromadb (pip install chromadb), and the line that previously failed now works without errors:
from langchain.vectorstores import Chroma
Hi @liangz1 My Databricks Runtime version is 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12) with langchain==0.0.125 and 0.0.235 and I have encountered the same error. Could you please take a look? Thank you in advance.
Hi @Feya, would you mind checking whether there is any error from the %pip install langchain command, and resolving the conflicting dependencies (typing-extensions)?
I was initially facing this issue; upgrading the Databricks runtime to 13.2 fixed it.
Pinning typing-extensions==4.5.0, as suggested above, also worked for me with langchain==0.0.248.
Use Databricks runtime version 13.0 while creating the cluster. Hope it helps.
Hi, @Abonia1. I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you encountered a TypeError when trying to import langchain in Databricks. It seems that the issue has been resolved by either installing langchain==0.0.125, upgrading typing-extensions to >=4.2.0, or pinning it to 4.5.0. The maintainers also suggested checking the Databricks Runtime version and resolving any conflicting dependencies.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project. Let us know if you have any further questions or concerns.