Langchain Import Issue
System Info
Python 3.10.8
langchain==0.0.229
AWS Sagemaker Studio w/ PyTorch 2.0.0 Python 3.10 GPU Optimized image
Who can help?
@hwchase17 or @agola11
Information
- [ ] The official example notebooks/scripts
- [X] My own modified scripts
Related Components
- [ ] LLMs/Chat Models
- [ ] Embedding Models
- [ ] Prompts / Prompt Templates / Prompt Selectors
- [ ] Output Parsers
- [X] Document Loaders
- [ ] Vector Stores / Retrievers
- [ ] Memory
- [ ] Agents / Agent Executors
- [ ] Tools / Toolkits
- [ ] Chains
- [ ] Callbacks/Tracing
- [ ] Async
Reproduction
This was working fine in a Jupyter Notebook in AWS SageMaker Studio for the past few weeks, but today it is failing with no code changes... an import chain issue?
```
!pip install langchain openai chromadb tiktoken pypdf unstructured pdf2image;
from langchain.document_loaders import TextLoader
```
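The rest of the failing cell (lines 2-3 in the traceback below) just constructs a TextLoader and loads the document; the error is raised at the import line before these ever run:

```python
# From the same cell (traceback lines 2-3); these never execute because the
# import above raises the TypeError first.
docLoader = TextLoader('./docs/nlitest.txt', encoding='utf8')
document = docLoader.load()
```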
Results in:
```
TypeError Traceback (most recent call last)
Cell In[10], line 1
----> 1 from langchain.document_loaders import TextLoader
2 docLoader = TextLoader('./docs/nlitest.txt', encoding='utf8')
3 document = docLoader.load()
File /opt/conda/lib/python3.10/site-packages/langchain/__init__.py:6
3 from importlib import metadata
4 from typing import Optional
----> 6 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
7 from langchain.cache import BaseCache
8 from langchain.chains import (
9 ConversationChain,
10 LLMBashChain,
(...)
18 VectorDBQAWithSourcesChain,
19 )
File /opt/conda/lib/python3.10/site-packages/langchain/agents/__init__.py:2
1 """Interface for agents."""
----> 2 from langchain.agents.agent import (
3 Agent,
4 AgentExecutor,
5 AgentOutputParser,
6 BaseMultiActionAgent,
7 BaseSingleActionAgent,
8 LLMSingleActionAgent,
9 )
10 from langchain.agents.agent_toolkits import (
11 create_csv_agent,
12 create_json_agent,
(...)
21 create_vectorstore_router_agent,
22 )
23 from langchain.agents.agent_types import AgentType
File /opt/conda/lib/python3.10/site-packages/langchain/agents/agent.py:25
17 from langchain.callbacks.base import BaseCallbackManager
18 from langchain.callbacks.manager import (
19 AsyncCallbackManagerForChainRun,
20 AsyncCallbackManagerForToolRun,
(...)
23 Callbacks,
24 )
---> 25 from langchain.chains.base import Chain
26 from langchain.chains.llm import LLMChain
27 from langchain.input import get_color_mapping
File /opt/conda/lib/python3.10/site-packages/langchain/chains/__init__.py:3
1 """Chains are easily reusable components which can be linked together."""
2 from langchain.chains.api.base import APIChain
----> 3 from langchain.chains.api.openapi.chain import OpenAPIEndpointChain
4 from langchain.chains.combine_documents.base import AnalyzeDocumentChain
5 from langchain.chains.combine_documents.map_reduce import MapReduceDocumentsChain
File /opt/conda/lib/python3.10/site-packages/langchain/chains/api/openapi/chain.py:17
15 from langchain.requests import Requests
16 from langchain.schema.language_model import BaseLanguageModel
---> 17 from langchain.tools.openapi.utils.api_models import APIOperation
20 class _ParamMapping(NamedTuple):
21 """Mapping from parameter name to parameter value."""
File /opt/conda/lib/python3.10/site-packages/langchain/tools/__init__.py:11
4 from langchain.tools.azure_cognitive_services import (
5 AzureCogsFormRecognizerTool,
6 AzureCogsImageAnalysisTool,
7 AzureCogsSpeech2TextTool,
8 AzureCogsText2SpeechTool,
9 )
10 from langchain.tools.base import BaseTool, StructuredTool, Tool, tool
---> 11 from langchain.tools.bing_search.tool import BingSearchResults, BingSearchRun
12 from langchain.tools.brave_search.tool import BraveSearch
13 from langchain.tools.convert_to_openai import format_tool_to_openai_function
File /opt/conda/lib/python3.10/site-packages/langchain/tools/bing_search/__init__.py:3
1 """Bing Search API toolkit."""
----> 3 from langchain.tools.bing_search.tool import BingSearchResults, BingSearchRun
5 __all__ = ["BingSearchRun", "BingSearchResults"]
File /opt/conda/lib/python3.10/site-packages/langchain/tools/bing_search/tool.py:10
5 from langchain.callbacks.manager import (
6 AsyncCallbackManagerForToolRun,
7 CallbackManagerForToolRun,
8 )
9 from langchain.tools.base import BaseTool
---> 10 from langchain.utilities.bing_search import BingSearchAPIWrapper
13 class BingSearchRun(BaseTool):
14 """Tool that adds the capability to query the Bing search API."""
File /opt/conda/lib/python3.10/site-packages/langchain/utilities/__init__.py:3
1 """General utilities."""
2 from langchain.requests import TextRequestsWrapper
----> 3 from langchain.utilities.apify import ApifyWrapper
4 from langchain.utilities.arxiv import ArxivAPIWrapper
5 from langchain.utilities.awslambda import LambdaWrapper
File /opt/conda/lib/python3.10/site-packages/langchain/utilities/apify.py:5
1 from typing import Any, Callable, Dict, Optional
3 from pydantic import BaseModel, root_validator
----> 5 from langchain.document_loaders import ApifyDatasetLoader
6 from langchain.document_loaders.base import Document
7 from langchain.utils import get_from_dict_or_env
File /opt/conda/lib/python3.10/site-packages/langchain/document_loaders/__init__.py:44
39 from langchain.document_loaders.duckdb_loader import DuckDBLoader
40 from langchain.document_loaders.email import (
41 OutlookMessageLoader,
42 UnstructuredEmailLoader,
43 )
---> 44 from langchain.document_loaders.embaas import EmbaasBlobLoader, EmbaasLoader
45 from langchain.document_loaders.epub import UnstructuredEPubLoader
46 from langchain.document_loaders.evernote import EverNoteLoader
File /opt/conda/lib/python3.10/site-packages/langchain/document_loaders/embaas.py:54
50 bytes: str
51 """The base64 encoded bytes of the document to extract text from."""
---> 54 class BaseEmbaasLoader(BaseModel):
55 """Base class for embedding a model into an Embaas document extraction API."""
57 embaas_api_key: Optional[str] = None
File /opt/conda/lib/python3.10/site-packages/pydantic/main.py:204, in pydantic.main.ModelMetaclass.__new__()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:488, in pydantic.fields.ModelField.infer()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:419, in pydantic.fields.ModelField.__init__()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:539, in pydantic.fields.ModelField.prepare()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:801, in pydantic.fields.ModelField.populate_validators()
File /opt/conda/lib/python3.10/site-packages/pydantic/validators.py:696, in find_validators()
File /opt/conda/lib/python3.10/site-packages/pydantic/validators.py:585, in pydantic.validators.make_typeddict_validator()
File /opt/conda/lib/python3.10/site-packages/pydantic/annotated_types.py:35, in pydantic.annotated_types.create_model_from_typeddict()
File /opt/conda/lib/python3.10/site-packages/pydantic/main.py:972, in pydantic.main.create_model()
File /opt/conda/lib/python3.10/site-packages/pydantic/main.py:204, in pydantic.main.ModelMetaclass.__new__()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:488, in pydantic.fields.ModelField.infer()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:419, in pydantic.fields.ModelField.__init__()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:534, in pydantic.fields.ModelField.prepare()
File /opt/conda/lib/python3.10/site-packages/pydantic/fields.py:638, in pydantic.fields.ModelField._type_analysis()
File /opt/conda/lib/python3.10/typing.py:1158, in _SpecialGenericAlias.__subclasscheck__(self, cls)
1156 return issubclass(cls.__origin__, self.__origin__)
1157 if not isinstance(cls, _GenericAlias):
-> 1158 return issubclass(cls, self.__origin__)
1159 return super().__subclasscheck__(cls)
TypeError: issubclass() arg 1 must be a class
```
Expected behavior
The module should import with no error.
I saw two similar issues on Discord; LangChain was working for a few weeks and then today started presenting the above error.
I am also experiencing the above error
Answer generated by a 🤖
Answer
I found related issues that might be helpful. I did my best to summarize the solutions, but I recommend looking at the related issues yourself.
Closed Issues
ImportError: cannot import name 'LLMChain' from partially initialized module 'langchain'
The suggested solution is: rename the file from requests.py to requests_chain.py to resolve the issue.
Unable to import from langchain.document_loaders
No solution found.
Open Issues
Issue: Can not import the Langchain modules.
The suggested solution is: the user should upgrade their Python version to 3.9 and then install LangChain version 0.0.219. The issue seems to be due to a version mismatch in the Pydantic module.
ERROR-Import Langchain : TypeError: dataclass_transform() got an unexpected keyword argument 'field_specifiers'
The suggested solution is: create a new environment with Python version 3.9.12, then install langchain (pip install langchain) and chromadb (pip install chromadb). This should resolve the issue.
This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
@luke-j0hnson
Can you run pip uninstall -y langchain and then reinstall, and see if your issue persists? It seemed to work for me...
Edit: Worked for 1 notebook, but not the other... hmm
Didn't solve the issue in my notebook either. The code had worked for weeks and stopped working tonight with the same error.
Same here, using the latest Python (3.11.4) in a Docker environment and LangChain 0.0.229 (the latest on PyPI). I tried to upgrade Pydantic to the latest version, as suggested here, but I got this error:
```
The conflict is caused by:
    The user requested pydantic~=2.0.2
    langchain 0.0.229 depends on pydantic<2 and >=1
```
Then I tried to upgrade to 1.10.11, but I use ChromaDB (0.3.27), which depends exclusively on Pydantic 1.9. Also, manually installing typing-inspect and typing_extensions, as was also suggested in the linked StackOverflow question, didn't help either.
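In case it helps with debugging, here is a minimal diagnostic sketch (standard library only, Python 3.8+) to confirm which versions of the packages discussed above actually ended up installed:

```python
# Print the installed versions of the packages involved in the conflict.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("langchain", "pydantic", "chromadb"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```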
@luke-j0hnson Can you duplicate the notebook and try and run it again?
> Same here, using the latest Python (3.11.4) in a Docker environment and LangChain 0.0.229 [...]
Is your development environment a Jupyter Notebook?
No, it's a standard environment; sorry, I forgot to mention that that's the only difference.
I'm not sure why my duplicated notebook is working now and hasn't been changed at all...
Try using chromadb==0.3.26. There is a problem with Chroma 0.3.27 and Pydantic: https://github.com/chroma-core/chroma/issues/785
I had the same issue and I solved it by installing the chromadb==0.3.26 version.
pip install pydantic==1.10.8 is working for me.
I've tried ChromaDB 0.3.26 as suggested, but that didn't work either :( It seems Pydantic needs to be upgraded too, but the ChromaDB requirement prevents upgrading it past version 1.9.
Update: Putting pydantic==1.10.8 after chromadb==0.3.26 in requirements.txt did the trick, thanks!
embedchain is also facing issues due to its dependency on Chroma. Fixed by locking the package version in setup.py for now: https://github.com/embedchain/embedchain/commit/34fd4180ec3d05de16046ad97d3edb105bbfddf1
FWIW, following the directions above from @lukafilipxvic and @softzer0, the configuration below is working, and my three-day nightmare is over.
```toml
[tool.poetry.dependencies]
python = "^3.10"
pydantic = "1.10.8"
chromadb = "0.3.26"
#chromadb = ">=0.3.27"
#pydantic = "1.9.0"
langchain = "^0.0.231"
llama-index = "^0.7.5"
openai = "^0.27.8"
ipykernel = "^6.24.0"
python-dotenv = "^1.0.0"
```
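After reinstalling with these pins, this quick sanity check (the same import that was failing above) goes through for me:

```python
# Sanity check: the import that previously raised
# "TypeError: issubclass() arg 1 must be a class".
from langchain.document_loaders import TextLoader
print("langchain import OK")
```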
I believe this is the same as #7548.
Hi, @Fulladorn! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
Based on my understanding, the issue you reported is that the module langchain.document_loaders cannot be imported due to a TypeError. There have been some suggestions in the comments, such as renaming a file, upgrading Python, and installing a specific version of LangChain. Additionally, there is a discussion about conflicts between Pydantic and ChromaDB versions.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project!
Please assist me with this problem:
ImportError: cannot import name 'Ollama' from 'langchain.llms'
In more detail, the error is written this way:
"from langchain.llms import Ollama ImportError: cannot import name 'Ollama' from 'langchain.llms' (/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/__init__.py)"
Use langchain_community.llms to import Ollama:
from langchain_community.llms import Ollama
Note that LangChain does not support Python versions below 3.8; use 3.8+.
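For example, a minimal sketch (assuming langchain-community is installed and a local Ollama server is running; the llama2 model name is just an example, use whichever model you have pulled):

```python
# Minimal sketch: requires `pip install langchain-community` and a running
# Ollama server with a model pulled, e.g. `ollama pull llama2`.
from langchain_community.llms import Ollama

llm = Ollama(model="llama2")  # example model name; swap in the one you use
print(llm.invoke("Why is the sky blue?"))
```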