
[Question]: What is stable to put into production?

Open bzImage opened this issue 9 months ago • 6 comments

Do you need to ask a question?

  • [ ] I have searched the existing questions and discussions, and this question is not already answered.
  • [ ] I believe this is a legitimate question, not just a bug or feature request.

Your Question

I tried the Postgres backend for everything; it hits this error: #1176.

Neo4j has this error: #1179.

What can we use as storage that is stable?

Additional Context

No response

bzImage avatar Mar 24 '25 22:03 bzImage

I got it working using MongoDB, Redis, Neo4j and Milvus. Are you running everything in Docker? Have you tested setting up each solution separately, so it doesn't come down to bad connection configurations?

When load testing my stack I also ran into the Neo4j connection timeout, but that went away after raising the connection pool size in the Neo4j storage implementation (neo4j_impl.py):

MAX_CONNECTION_POOL_SIZE = int(os.environ.get("NEO4J_MAX_CONNECTION_POOL_SIZE", config.get("neo4j", "connection_pool_size", fallback=400)))

That is, 400 instead of the default 50. You can use Cypher to check what maximum connection pool size your Neo4j database can handle. I'm using the Community Edition.
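
For reference, here is a minimal sketch of how that override resolves and how the value reaches the Neo4j Python driver. The `config` object is assumed to be a `configparser` instance reading LightRAG's config.ini, and the connection environment variables are placeholders; the exact layout may differ in your version of neo4j_impl.py:

```python
import os
import configparser

from neo4j import AsyncGraphDatabase  # official neo4j Python driver

# Assumption: an optional config.ini with a [neo4j] section sits next to the app.
config = configparser.ConfigParser()
config.read("config.ini")

# Resolution order: environment variable, then config.ini, then the 400 fallback.
MAX_CONNECTION_POOL_SIZE = int(
    os.environ.get(
        "NEO4J_MAX_CONNECTION_POOL_SIZE",
        config.get("neo4j", "connection_pool_size", fallback=400),
    )
)

# The pool size is a client-side driver setting, so it is passed when the driver is created.
driver = AsyncGraphDatabase.driver(
    os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
    auth=(
        os.environ.get("NEO4J_USERNAME", "neo4j"),
        os.environ.get("NEO4J_PASSWORD", ""),
    ),
    max_connection_pool_size=MAX_CONNECTION_POOL_SIZE,
)
```

Exporting NEO4J_MAX_CONNECTION_POOL_SIZE in your environment achieves the same thing without editing neo4j_impl.py.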

But I just posted an issue about the latency:

And because of the latency I was leaning towards trying to run everything in PostgreSQL to see if the performance increases.

frederikhendrix avatar Mar 26 '25 08:03 frederikhendrix

I first tried MongoDB, but two weeks ago it was broken.

So I tried Milvus + Neo4j + Redis; Neo4j is broken.

So I tried Postgres, but it seems that when you really push documents/relations/entities (>1000 entities), it goes down in flames.

bzImage avatar Mar 27 '25 15:03 bzImage

BTW, if you store everything in text files it works, and it even works faster than using a proper database behind it.
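
For completeness, staying on the default file-based backends looks roughly like this. This is a minimal sketch only: the storage class names (JsonKVStorage, NanoVectorDBStorage, NetworkXStorage) are the defaults listed in the LightRAG README, and the LLM helper import is an assumption, so check your installed version:

```python
import asyncio

from lightrag import LightRAG, QueryParam
from lightrag.llm.openai import gpt_4o_mini_complete  # assumed helper; swap in your own LLM function

async def main():
    rag = LightRAG(
        working_dir="./rag_storage",           # everything is persisted as plain files in this directory
        llm_model_func=gpt_4o_mini_complete,
        # embedding_func omitted for brevity; configure it as shown in the README
        kv_storage="JsonKVStorage",            # default file-based key-value store
        vector_storage="NanoVectorDBStorage",  # default file-based vector store
        graph_storage="NetworkXStorage",       # default file-based graph store
    )
    await rag.initialize_storages()            # required in recent releases

    await rag.ainsert("Some document text ...")
    answer = await rag.aquery("What is this about?", param=QueryParam(mode="hybrid"))
    print(answer)

asyncio.run(main())
```

With this setup there are no external services to keep alive, which is probably why it feels both simpler and faster at this scale.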

bzImage avatar Mar 27 '25 15:03 bzImage

> I first tried MongoDB, but two weeks ago it was broken.
>
> So I tried Milvus + Neo4j + Redis; Neo4j is broken.
>
> So I tried Postgres, but it seems that when you really push documents/relations/entities (>1000 entities), it goes down in flames.

Neo4j isn't broken, and MongoDB isn't either. Just adding "file_path" fixed MongoDB, and that got patched in the latest release. neo4j_impl.py just isn't optimised. I am currently retrieving with hybrid search within 1 second using Neo4j, Milvus, Redis, MongoDB, and FastAPI with Gunicorn.
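
For context, the stack above is wired up roughly like this. A sketch only: the storage class names (RedisKVStorage, MilvusVectorDBStorage, Neo4JStorage, MongoDocStatusStorage) and the environment variables they read are assumptions based on the LightRAG codebase at the time of writing and may differ between releases, so verify them under lightrag/kg/ in your version:

```python
from lightrag import LightRAG
from lightrag.llm.openai import gpt_4o_mini_complete  # assumed helper; swap in your own LLM function

# Connection details (NEO4J_URI, MONGO_URI, REDIS_URI, MILVUS_URI, credentials, ...)
# are read from environment variables by the individual storage implementations.
rag = LightRAG(
    working_dir="./rag_storage",
    llm_model_func=gpt_4o_mini_complete,
    kv_storage="RedisKVStorage",                 # LLM cache and text chunks in Redis
    vector_storage="MilvusVectorDBStorage",      # embeddings in Milvus
    graph_storage="Neo4JStorage",                # entity/relation graph in Neo4j
    doc_status_storage="MongoDocStatusStorage",  # document pipeline status in MongoDB
)
```

The FastAPI/Gunicorn layer just wraps rag.aquery() behind an HTTP endpoint and is independent of the storage choice.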

I was thinking of posting how I improved Neo4j to become 5x faster and a lot less CPU heavy.

frederikhendrix avatar Mar 28 '25 07:03 frederikhendrix

> I was thinking of posting how I improved Neo4j to become 5x faster and a lot less CPU heavy.

Please do.

bzImage avatar Mar 28 '25 20:03 bzImage

> I was thinking of posting how I improved Neo4j to become 5x faster and a lot less CPU heavy. Please do.

I posted it; I hope something is done with the information:

https://github.com/HKUDS/LightRAG/issues/1246

frederikhendrix avatar Apr 01 '25 08:04 frederikhendrix

All storage drivers have undergone significant improvements. Please verify whether the issue is resolved with the latest version.

danielaskdd avatar Jul 20 '25 01:07 danielaskdd