
Ingesting a large number of files

Open Codeluck opened this issue 1 year ago • 1 comments

I am trying to ingest about 340k files, roughly 30 GB in total. After a few hours of ingesting, I got a SQLite error: `sqlite3.DataError: string or blob too big`

Has anyone faced a similar issue? Is it okay to ingest this much data? Is there a way to avoid SQLite's restrictions?

Codeluck avatar Feb 28 '24 20:02 Codeluck

It's using SQLite because it uses the simple (filesystem-based) index and document stores. Once https://github.com/imartinez/privateGPT/pull/1706 is merged, those can be moved into Postgres.

dbzoo avatar Mar 12 '24 01:03 dbzoo