
ValueError for Shapes when using Neo4j.

Status: Open · parshvadaftari opened this issue 9 months ago · 6 comments

🐛 Describe the bug

When using Neo4j as the graph_store and setting the embedding dimension to anything less than 1536, the code below raises the error shown underneath it. It also emits a deprecation warning for the langchain-neo4j package.

import os
from mem0 import Memory

config = {
    "embedder": {
        "provider": "openai",
        "config": {
            "api_key": os.getenv("OPENAI_API_KEY"),
            "model": "text-embedding-3-small",
            "embedding_dims": 384,
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxxxxxx.databases.neo4j.io",
            "username": "neo4j",
            "password": "xxxxxxxxxxx"
        }
    }
}


m = Memory.from_config(config)
messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about a thriller movie? They can be quite engaging."},
    {"role": "user", "content": "I’m not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."}
]
add_result = m.add(messages, user_id="alice", metadata={"category": "movies"})

ERROR:

ValueError: shapes (0,1536) and (384,) not aligned: 1536 (dim 1) != 384 (dim 0)
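
For context, this is the generic NumPy error for a dot product between mismatched shapes; a minimal sketch in plain NumPy (not mem0's actual internals) reproduces it:

import numpy as np

# Stored graph-node embeddings: zero rows, each 1536-dimensional
stored = np.empty((0, 1536))
# A query embedding produced with embedding_dims=384
query = np.random.rand(384)

# Raises: shapes (0,1536) and (384,) not aligned: 1536 (dim 1) != 384 (dim 0)
np.dot(stored, query)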

parshvadaftari · Mar 11 '25

Hello, I have reproduced this bug. The issue is caused by "embedding_dims": 384. To fix it, change it to "embedding_dims": 1536, as in the snippet below.
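
For example, keeping everything else in the original config unchanged (a sketch, assuming the same OpenAI setup as in the report):

config = {
    "embedder": {
        "provider": "openai",
        "config": {
            "api_key": os.getenv("OPENAI_API_KEY"),
            "model": "text-embedding-3-small",
            "embedding_dims": 1536,  # the model's native output size
        },
    },
    # graph_store config unchanged
}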

I have attached a screenshot of the reproduced bug.


alokjha2 · Mar 12 '25

@alokjha2 That's not a fix for the bug. Many embedding models don't support more than 768 dims, e.g. Nomic (see the config sketch below), so raising the value to 1536 cannot solve the issue people actually face.
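
To illustrate, a local Nomic model served through Ollama emits 768-dimensional vectors, so a config like the following can never be bumped to 1536 (a hypothetical sketch; the ollama provider and model name are assumptions, not taken from this thread):

config = {
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",  # produces 768-dim vectors
            "embedding_dims": 768,
        },
    },
    # ... graph_store config as before ...
}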

parshvadaftari · Mar 12 '25

@Dev-Khant I can pick up this issue!

parshvadaftari · Mar 12 '25

I checked this code, and it works with the dimension set to 1536.

Regarding your statement: "It's not the solution to the bug, as many embedding models don't support more than 768 dimensions (e.g., Nomic). This fix cannot resolve the issue being faced."

Yes, many embedding models don't support more than 768 dimensions, but the model the user is using for embedding generation supports 1536 dimensions.
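
(Side note: text-embedding-3-small natively returns 1536-dimensional vectors, but the OpenAI API also accepts a dimensions parameter that shortens the output, which is presumably what mem0's embedding_dims maps to. A minimal sketch against the OpenAI Python client:)

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# text-embedding-3-small defaults to 1536 dims, but the API can
# truncate the embedding to a smaller size on request.
resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="I love sci-fi movies.",
    dimensions=384,
)
print(len(resp.data[0].embedding))  # 384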

alokjha2 · Mar 13 '25

And if you think something else is causing this issue, what do you believe it is, and how would you approach fixing it?

alokjha2 · Mar 13 '25

> I checked this code, and it works with the dimension set to 1536.
>
> Regarding your statement: "It's not the solution to the bug, as many embedding models don't support more than 768 dimensions (e.g., Nomic). This fix cannot resolve the issue being faced."
>
> Yes, many embedding models don't support more than 768 dimensions, but the model the user is using for embedding generation supports 1536 dimensions.

This embedding model does, but not every model does. So in short this becomes a feature request: you should be able to set any embedding dimension the chosen model supports. I'm still looking into where the issue is occurring; I'll link the PR once it's solved.
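
One plausible direction (my speculation, not necessarily what the eventual PR does) is to validate the dimensions up front and fail with an actionable message instead of a raw shape error. A hypothetical guard:

import numpy as np

def cosine_similarities(stored: np.ndarray, query: np.ndarray) -> np.ndarray:
    # Hypothetical guard: fail loudly when the configured embedding_dims
    # doesn't match the width of the vectors already in the graph store.
    if stored.ndim == 2 and stored.shape[1] != query.shape[0]:
        raise ValueError(
            f"Embedding dimension mismatch: the graph store holds "
            f"{stored.shape[1]}-dim vectors but the embedder produced "
            f"{query.shape[0]} dims. Check 'embedding_dims' in your config."
        )
    norms = np.linalg.norm(stored, axis=1) * np.linalg.norm(query)
    return stored @ query / np.where(norms == 0, 1.0, norms)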

parshvadaftari · Mar 13 '25