import-wikidata-dump-to-couchdb
use CouchDB bulk insert API
maxlath: hello! Any recommendation on how to import a data dump of 40GB+ of newline-delimited JSON into CouchDB? I assume I should go with the bulk import API, but I can't just throw the whole dump at Couch at once, right? What would be the optimal/sustainable split size? thanks in advance :)

jan____: maxlath: 1k-10k batches should do
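A minimal sketch of the suggested approach: stream the dump line by line, parse batches of a few thousand documents, and POST each batch to CouchDB's `_bulk_docs` endpoint. The database URL, database name, and batch size below are assumptions for illustration, not part of the original exchange; it also assumes the dump is plain newline-delimited JSON (Wikidata dump lines may carry a trailing comma, which is stripped here).

```python
import json
from itertools import islice


def iter_batches(lines, batch_size=5000):
    """Yield lists of parsed docs from newline-delimited JSON lines.

    Strips an optional trailing comma per line (as found in Wikidata
    dumps) and skips blank lines. batch_size of 1k-10k follows the
    advice above.
    """
    docs = (json.loads(line.rstrip().rstrip(","))
            for line in lines if line.strip(" \t\n,"))
    while True:
        batch = list(islice(docs, batch_size))
        if not batch:
            return
        yield batch


def bulk_import(dump_path, couch_url="http://localhost:5984/wikidata",
                batch_size=5000):
    """Stream the dump and POST each batch to CouchDB's _bulk_docs API.

    couch_url is a hypothetical local database; adjust as needed.
    """
    import requests  # third-party; only needed for the actual upload
    with open(dump_path, encoding="utf-8") as f:
        for batch in iter_batches(f, batch_size):
            resp = requests.post(f"{couch_url}/_bulk_docs",
                                 json={"docs": batch})
            resp.raise_for_status()
```

Keeping batches in this size range bounds memory use on both the client and the server, since only one batch of documents is ever held in memory at a time.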