Interrupted iteration over a SqliteDict instance
I have a file created by sqlitedict in Python 2.7; the file size is 11.37 GB (761,951 keys).
Recently I opened it with the read-only flag in Python 3.6 and noticed something interesting:
Scenario A: If I iterate over all 761,951 keys (for key, value in dict.items()), memory consumption stays very low. Like this:

Scenario B: If the iteration is broken prematurely, say at iteration counter=100, and dict.close() is then called, memory consumption climbs sharply. Like this:

Scenario C: If the iteration is broken prematurely, say at iteration counter=100, but dict.close() is not called, memory consumption keeps growing for over 10 seconds after execution has left the loop, and ends up as high as in Scenario B.
(Scenarios A, B, and C all run in a thread via Twisted's reactor.callInThread; a minimal sketch of this setup is shown below.)
This suggests that the iteration is still running even after the break.
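
For reference, a minimal sketch of the setup described above, assuming a hypothetical database file `big.sqlite` (Scenario C is the same but without the `close()` call):

```python
from sqlitedict import SqliteDict
from twisted.internet import reactor

def scenario_b():
    # Open the existing database read-only.
    d = SqliteDict('big.sqlite', flag='r')
    for counter, (key, value) in enumerate(d.items()):
        if counter == 100:
            break          # leave the loop early (Scenarios B and C)
    d.close()              # Scenario B: close right after the break

reactor.callInThread(scenario_b)
reactor.run()
```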
This is the same issue as #48, I think. Querying items() lets a worker thread stuff all of the results into a Queue(). If the Queue() is being consumed, memory usage stays normal; if there is no consumer, it keeps filling up.
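
A simplified illustration of that pattern (not sqlitedict's actual code): a producer thread keeps filling an unbounded Queue even after the consumer stops reading, so memory keeps growing until every remaining row has been enqueued.

```python
import threading
import queue

def producer(q, total):
    # Stands in for the worker thread that streams every row out of SQLite.
    for i in range(total):
        q.put((i, 'value-%d' % i))   # unbounded Queue: put() never blocks

q = queue.Queue()                    # no maxsize, so it can grow without limit
threading.Thread(target=producer, args=(q, 761951), daemon=True).start()

# The consumer breaks out after 100 items, as in Scenarios B and C ...
for _ in range(100):
    q.get()

# ... but the producer keeps running in the background, enqueueing the
# remaining rows until memory usage peaks.
```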