TypeError: 'NoneType' object has no attribute '__getitem__'
I've been getting many of these errors while running an Elasticsearch delta snapshot across 35 nodes. I'm not sure whether this is causing any problems, but it is worrying. I didn't see these errors at all during the initial snapshot, which wrote many gigabytes of files. Is there an issue with listing or reading existing files? Should I be worried about data loss or corruption?
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/fuse.py", line 414, in _wrapper
return func(*args, **kwargs) or 0
File "/usr/local/lib/python2.7/dist-packages/fuse.py", line 483, in open
fi.flags)
File "/usr/local/lib/python2.7/dist-packages/fuse.py", line 881, in __call__
ret = getattr(self, op)(path, *args)
File "/usr/local/lib/python2.7/dist-packages/yas3fs/__init__.py", line 2430, in open
if not self.check_data(path):
File "/usr/local/lib/python2.7/dist-packages/yas3fs/__init__.py", line 1834, in check_data
etag = k.etag[1:-1]
TypeError: 'NoneType' object has no attribute '__getitem__'
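For reference, the failure is the unguarded slice `k.etag[1:-1]` in check_data: when the key's etag is None, slicing raises this TypeError. A minimal defensive sketch (assuming `key` is a boto-style key object whose `etag` attribute may be None, e.g. for a key that was listed but never fully populated) might look like:

```python
def etag_without_quotes(key):
    """Return the S3 ETag with its surrounding double quotes stripped,
    or None when the key has no ETag yet.

    The unguarded `key.etag[1:-1]` raises
    TypeError: 'NoneType' object has no attribute '__getitem__'
    whenever etag is None; returning None instead lets the caller
    treat the cached data as unverifiable and re-fetch it.
    """
    etag = getattr(key, "etag", None)
    if etag is None:
        return None
    # S3 reports ETags wrapped in double quotes, e.g. '"d41d8cd..."'.
    return etag.strip('"')
```

Callers in check_data would then need to decide what a None ETag means (e.g. force a fresh download rather than trust the cache); the helper name and the strip-based unquoting here are illustrative, not yas3fs's actual code.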
It turns out that this does affect my snapshots: every incremental snapshot fails to finish, so I'm going to have to stop using yas3fs for this. :(
Sorry to hear that. The underlying issue may be that for every update, yas3fs has to re-upload the whole file to S3. With large files receiving incremental writes, that is a hard workload to manage.
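Since S3 objects are immutable, a whole-file-upload design means traffic scales with file size times write count rather than with the bytes actually changed. A rough back-of-the-envelope sketch (the function and numbers are illustrative, not measured from yas3fs):

```python
def bytes_reuploaded(file_size, num_writes):
    """Estimate total upload traffic when every write forces a
    full re-upload of the current file, as with an S3-backed
    filesystem that cannot patch objects in place."""
    return file_size * num_writes

# A 1 GiB Lucene segment file touched 100 times during an
# incremental snapshot would re-upload roughly 100 GiB.
traffic = bytes_reuploaded(2**30, 100)
```

This is why delta snapshots, which repeatedly append to or rewrite large existing files, behave so much worse here than the initial snapshot, which writes each file once.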