
Bulk insertion speed

Open nmoreaud opened this issue 3 years ago • 0 comments

Hello

With this code I can insert 900 rows/sec on an SSD. On a network folder it only reaches 200 rows/sec.

for i in range(50000):
    data = (str(i), str(i) + '.jpeg', 'Lorem ipsum dolor sit amet', '{GCF2807B-E86B-CAC7-5F21-4B18C46ADDD0}', '441000', '4637070', '5', '600', '600', '791', '6', '41', '27', '0', '0', '0', '1.5', '1753322669', '0', '440998', '443998', '4634072', '4637072', '440998', '443998', '4634072', '4637072', 0, 1)
    mydbf.append(data)  # write one record at a time
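For completeness, this is roughly how the table is created and opened beforehand (a minimal sketch; the field names, widths, and file name below are placeholders, the real table has the 29 columns used above):

import dbf

# Placeholder schema, not the real 29-column one
mydbf = dbf.Table('metadata.dbf', 'id C(10); filename C(64); caption C(64)')
mydbf.open(dbf.READ_WRITE)

for i in range(50000):
    # each append() writes the record straight to disk
    mydbf.append((str(i), str(i) + '.jpeg', 'Lorem ipsum dolor sit amet'))

mydbf.close()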

This doesn't seem any faster:

metadata.append(multiple=50000)  # pre-allocate 50000 blank records, takes ~7 s
i = 0
for rec in dbf.Process(metadata):
    data = (str(i), str(i) + '.jpeg', 'Lorem ipsum dolor sit amet', '{GCF2807B-E86B-CAC7-5F21-4B18C46ADDD0}', '441000', '4637070', '5', '600', '600', '791', '6', '41', '27', '0', '0', '0', '1.5', '1753322669', '0', '440998', '443998', '4634072', '4637072', '440998', '443998', '4634072', '4637072', 0, 1)
    rec[:] = data  # overwrite the blank record in place
    i += 1

Is there any sub-second bulk insert method?
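One workaround I am considering for the network-folder case (just a sketch, assuming the share only needs the finished file; the paths and schema below are placeholders): build the table on the local SSD and copy it to the share afterwards, so the 50000 small record writes never touch the network.

import shutil
import dbf

# Placeholder paths, not from the original code
LOCAL_PATH = r'C:\temp\metadata.dbf'
SHARE_PATH = r'\\server\share\metadata.dbf'

table = dbf.Table(LOCAL_PATH, 'id C(10); filename C(64); caption C(64)')
table.open(dbf.READ_WRITE)
for i in range(50000):
    table.append((str(i), str(i) + '.jpeg', 'Lorem ipsum dolor sit amet'))
table.close()

# one sequential copy over the network instead of 50000 small writes
shutil.copy2(LOCAL_PATH, SHARE_PATH)

This helps the network case but still leaves the ~900 rows/sec ceiling on local disk, hence the question above.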

nmoreaud · Jan 26 '22 14:01