bulkupdate
Error on putting lots of log entries
When my handle_entity method makes lots of self.log() calls, I sometimes get the following error:
cannot put more than 500 entities in a single call
Traceback (most recent call last):
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/__init__.py", line 512, in __call__
    handler.post(*groups)
  File "/base/python_lib/versions/1/google/appengine/ext/deferred/deferred.py", line 256, in post
    run(self.request.body)
  File "/base/python_lib/versions/1/google/appengine/ext/deferred/deferred.py", line 124, in run
    return func(*args, **kwds)
  File "/base/python_lib/versions/1/google/appengine/ext/deferred/deferred.py", line 166, in invoke_member
    return getattr(obj, membername)(*args, **kwargs)
  File "/base/data/home/apps/my-loc/1a.341384336210760638/bulkupdate/__init__.py", line 264, in _run
    db.put(log_entries)
  File "/base/python_lib/versions/1/google/appengine/ext/db/__init__.py", line 1244, in put
    keys = datastore.Put(entities, rpc=rpc)
  File "/base/python_lib/versions/1/google/appengine/api/datastore.py", line 284, in Put
    raise _ToDatastoreError(err)
BadRequestError: cannot put more than 500 entities in a single call
Apparently, although bulkupdate tracks how many of my entities it will put before issuing a batch, it's easily foiled by its own log entries: the accumulated logs can push a single db.put() past the datastore's 500-entity limit. I can remove the .log() calls, obviously, but it would be nice if bulkupdate didn't fail like this.
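For what it's worth, one way around the limit would be to flush the log entries in slices of at most 500 before putting them. This is just a sketch, not bulkupdate's actual code; chunks() is a hypothetical helper:

```python
def chunks(entities, batch_size=500):
    """Yield successive slices of at most batch_size entities."""
    for i in range(0, len(entities), batch_size):
        yield entities[i:i + batch_size]

# Hypothetical flush loop: each slice stays under the datastore's
# 500-entities-per-put limit, so db.put() never raises BadRequestError.
# for batch in chunks(log_entries):
#     db.put(batch)
```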
Known issue - you really shouldn't be logging this much. Still a bug, though, so I'll leave it open.