piker
`OSError: too many files open` on linux
It's odd because I've seen this from two different sources in the #302 history but I'm not sure either is legit.
The main source of the issue seems to be that we're loading boatloads of data and one of two sys-level calls triggers this:
- socket re-connection of some sort to the FSP engine inside the `trio` streams APIs
- our `ShmArray`'s usage of stdlib's `SharedMemory`, which causes it when repeatedly attaching (not creating) to an existing buffer (again, something the FSP engine will do on task respawn-resyncs when a new history frame is loaded)
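Since the failure mode is fd exhaustion, it helps to watch the process's open-fd count against the `RLIMIT_NOFILE` soft limit while reproducing. A minimal Linux-only helper (the function name here is my own, not part of piker):

```python
import os
import resource


def fd_usage() -> tuple[int, int]:
    # Count currently-open fds for this process via /proc (Linux-only)
    # and read the soft RLIMIT_NOFILE ceiling which, once hit, produces
    # `OSError: [Errno 24] Too many open files`.
    open_fds = len(os.listdir('/proc/self/fd'))
    soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    return open_fds, soft


used, limit = fd_usage()
print(f'{used}/{limit} fds in use')
```

Logging this before and after each `SharedMemory` attach should show whether the count climbs monotonically toward the limit.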
The next time someone sees this, please paste the whole traceback. I've stuck some preventative code in #302 but it didn't seem to do much for me, IIRC.
Another thing we can try for the shm case is a busy loop around the non-create call:
```python
from multiprocessing.shared_memory import SharedMemory

key = 'blah'

# create the segment once
shm = SharedMemory(
    name=key,
    create=True,
    size=4096,  # create=True requires a size
)

# then repeatedly attach to it without ever closing,
# each attach should open a new fd on the segment
attachments = []
while True:
    attachments.append(
        SharedMemory(
            name=key,
            create=False,
        )
    )
```
and see if that triggers it; if so, it's likely a lower-level bug we can report upstream?
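Conversely, if the leak really is un-closed attach handles, explicitly calling `.close()` after each attach should keep the fd count flat. A sketch of that workaround (the helper name and key are hypothetical, not piker code):

```python
from multiprocessing.shared_memory import SharedMemory


def attach_and_read(key: str) -> bytes:
    # Attach to an existing segment, copy the bytes out, and close
    # the handle immediately so the backing fd is released.
    shm = SharedMemory(name=key, create=False)
    try:
        return bytes(shm.buf)
    finally:
        shm.close()  # unmaps the buffer and releases the fd
```

If the busy loop above still exhausts fds even with `.close()` in place, that would point more strongly at a stdlib or kernel-level problem.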