letta
cannot import name 'Document' from 'llama_index.core' (unknown location)
Describe the bug
When I run "memgpt run" or "memgpt configure" on my Windows 10 X64 machine with Python 3.11.7, I get the following error:
memgpt run
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Program Files\Python311\Scripts\memgpt.exe\__main__.py", line 4, in <module>
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\__init__.py", line 3, in <module>
    from memgpt.client.client import create_client
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\client\client.py", line 8, in <module>
    from memgpt.cli.cli import QuickstartChoice
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\cli\cli.py", line 26, in <module>
    from memgpt.agent import Agent, save_agent
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\agent.py", line 15, in <module>
    from memgpt.persistence_manager import LocalStateManager
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\persistence_manager.py", line 5, in <module>
    from memgpt.memory import (
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\memory.py", line 10, in <module>
    from memgpt.embeddings import embedding_model, query_embedding, parse_and_chunk_text
  File "C:\Program Files\Python311\Lib\site-packages\memgpt\embeddings.py", line 14, in <module>
    from llama_index.core import Document as LlamaIndexDocument
ImportError: cannot import name 'Document' from 'llama_index.core' (unknown location)
Please describe your setup
- [ ] How did you install memgpt?
  - pip install pymemgpt and pip install pymemgpt[postgres]
- [ ] Describe your setup
  - What's your OS (Windows/MacOS/Linux)? Windows 10 X64
  - How are you running memgpt? cmd.exe
MemGPT Config
Please attach your ~/.memgpt/config file or copy-paste it below.
I am using OpenAI
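For anyone triaging: the failing import can be reproduced outside of memgpt with a two-line check. This is a minimal sketch assuming a llama-index 0.10-style package layout, where the same class is also exposed under llama_index.core.schema (the path the patch further down in this thread switches to):

try:
    # The re-export that fails in the traceback above.
    from llama_index.core import Document
except ImportError:
    # Fallback path used by the diff later in this thread.
    from llama_index.core.schema import Document
print(Document)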
Getting the same error on Linux (Debian) with pip install pymemgpt -U, pymemgpt v0.3.4.
Did the config path change in the install? i.e. /root/.memgpt
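A quick way to look from the same environment (a sketch only; ~/.memgpt/config is the default location named in the template above, and whether it moved between releases is exactly the open question):

import os

# Default MemGPT config location per the issue template above; this only reports
# whether the file exists, it does not prove which path the installed version reads.
path = os.path.expanduser("~/.memgpt/config")
print(path, "exists" if os.path.exists(path) else "missing")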
I have just installed the package using the command pip install -U pymemgpt and I am getting the same error.
Same here. I installed with pip install -e ., then updated llama-index to 0.1-14; I get the same error before and after the update.
I had to make the following changes and install from source, i.e. pip install -e ".":
diff --git a/memgpt/data_sources/connectors.py b/memgpt/data_sources/connectors.py
index 00d93d8..fad929f 100644
--- a/memgpt/data_sources/connectors.py
+++ b/memgpt/data_sources/connectors.py
@@ -6,7 +6,7 @@ from memgpt.data_types import Document, Passage
 from typing import List, Iterator, Dict, Tuple, Optional
 
 import typer
-from llama_index.core import Document as LlamaIndexDocument
+from llama_index.core.schema import Document as LlamaIndexDocument
 
 
 class DataConnector:
@@ -103,7 +103,7 @@ class DirectoryConnector(DataConnector):
             assert self.input_directory is not None, "Must provide input directory if recursive is True."
 
     def generate_documents(self) -> Iterator[Tuple[str, Dict]]:  # -> Iterator[Document]:
-        from llama_index.core import SimpleDirectoryReader
+        from llama_index.core.readers import SimpleDirectoryReader
 
         if self.input_directory is not None:
             reader = SimpleDirectoryReader(
diff --git a/memgpt/embeddings.py b/memgpt/embeddings.py
index ee7d4b8..5792641 100644
--- a/memgpt/embeddings.py
+++ b/memgpt/embeddings.py
@@ -11,7 +11,7 @@ from memgpt.constants import MAX_EMBEDDING_DIM, EMBEDDING_TO_TOKENIZER_MAP, EMBE
 # from llama_index.core.base.embeddings import BaseEmbedding
 from llama_index.core.node_parser import SentenceSplitter
-from llama_index.core import Document as LlamaIndexDocument
+from llama_index.core.schema import Document as LlamaIndexDocument
 
 # from llama_index.core.base.embeddings import BaseEmbedding
 # from llama_index.core.embeddings import BaseEmbedding
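After applying the patch above (or an equivalent import change), a quick sanity check is that the three rewritten import paths resolve. The module paths come straight from the diff; the print is just illustrative:

from llama_index.core.schema import Document as LlamaIndexDocument
from llama_index.core.readers import SimpleDirectoryReader
from llama_index.core.node_parser import SentenceSplitter

# If these imports succeed, the patched memgpt modules should import cleanly as well.
print(LlamaIndexDocument, SimpleDirectoryReader, SentenceSplitter)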
Is this still an issue on the latest version?
I am still facing this. Just updated from version 0.3.0 to 0.3.14
I can confirm that this is still an issue on the latest version via PyPI.
Installing from the repo with pip install --upgrade --no-cache-dir --force-reinstall git+https://github.com/cpacker/MemGPT.git results in:
Traceback (most recent call last):
File "C:\Program Files\Python310\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Program Files\Python310\lib\runpy.py", line 86, in _run_code
exec(code, run_globals)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\Scripts\memgpt.exe\__main__.py", line 7, in <module>
sys.exit(app())
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\typer\main.py", line 328, in __call__
raise e
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\typer\main.py", line 311, in __call__
return get_command(self)(*args, **kwargs)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\click\core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\typer\core.py", line 783, in main
return _main(
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\typer\core.py", line 225, in _main
rv = self.invoke(ctx)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\click\core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\click\core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\click\core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\typer\main.py", line 683, in wrapper
return callback(**use_params) # type: ignore
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\cli\cli.py", line 651, in run
memgpt_agent = Agent(
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\agent.py", line 282, in __init__
self.persistence_manager = LocalStateManager(agent_state=self.agent_state)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\persistence_manager.py", line 51, in __init__
self.archival_memory = EmbeddingArchivalMemory(agent_state)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\memory.py", line 385, in __init__
self.embed_model = embedding_model(agent_state.embedding_config)
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\embeddings.py", line 205, in embedding_model
return default_embedding_model()
File "C:\Users\pedr0\AppData\Roaming\Python\Python310\site-packages\memgpt\embeddings.py", line 141, in default_embedding_model
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
ModuleNotFoundError: No module named 'llama_index.embeddings.huggingface'
Fixed by installing llama-index-embeddings-huggingface
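For anyone hitting the same ModuleNotFoundError: the HuggingFace embeddings ship as a separate package, so after pip install llama-index-embeddings-huggingface the exact import from the traceback should resolve. A minimal check (nothing memgpt-specific assumed here):

# This is the import that raised ModuleNotFoundError above; it should work once
# the llama-index-embeddings-huggingface package is installed.
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
print(HuggingFaceEmbedding)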