genpy
generate_dynamic creates millions of tmp directories
I'm creating an application that reads ROS messages from a ROS bag. While reading messages, generate_dynamic(core_type, msg_cat) is called to get the message type definition.
Inside generate_dynamic, a temporary directory is created to hold the dynamically generated message classes:
# Create a temporary directory
tmp_dir = tempfile.mkdtemp(prefix='genpy_')
It then registers an atexit handler to remove the directory when the program terminates:
# Afterwards, we are going to remove the directory so that the .pyc file gets cleaned up if it's still around
atexit.register(shutil.rmtree, tmp_dir)
But my application is not meant to terminate and must run indefinitely.
In addition, atexit handlers do not get called if any of the following happens:
- the program dies because of a signal
- os._exit() is invoked directly
- a Python fatal error is detected (in the interpreter)
As a result, millions of genpy_XXXXXX directories, one for each message my application receives, accumulate in the /tmp directory of my machine, which eventually causes the machine to crash.
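For reference, this is roughly the calling pattern that triggers the leak. It is a minimal sketch: the type name and definition string below are placeholders for what the bag actually provides with each connection record.

import genpy.dynamic

msg_def = "string data\n"  # placeholder for the full definition text stored in the bag

for _ in range(1000):  # one call per received message
    # Every call runs tempfile.mkdtemp(prefix='genpy_') internally and registers
    # an atexit cleanup, so directories pile up in /tmp in a long-lived process.
    classes = genpy.dynamic.generate_dynamic('std_msgs/String', msg_def)
    MsgClass = classes['std_msgs/String']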
Do you have any suggestions on how to solve this issue? I'm thinking that there should be a check for an existing message type module before a new one is created.
Please consider contributing a pull request to accommodate your use case. You need to make sure not to break any API though, since otherwise it can't be accepted.
Do you have any suggestions on how to solve this issue?
Maybe add an optional argument to the function (tmp_dir=None), and if it is passed the function doesn't create a temporary directory and doesn't register it with atexit. Then you can pass an arbitrary directory and control the cleanup yourself.
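As a rough sketch of what the call site could look like if such a tmp_dir keyword were added; the argument does not exist in genpy today, so the call below is illustrative only.

import shutil
import tempfile

import genpy.dynamic

msg_def = "string data\n"  # placeholder for the full definition text from the bag

tmp_dir = tempfile.mkdtemp(prefix='my_app_genpy_')
try:
    # One shared directory for all dynamically generated modules instead of
    # one mkdtemp() per call; tmp_dir is the proposed (not yet existing) keyword.
    classes = genpy.dynamic.generate_dynamic('std_msgs/String', msg_def, tmp_dir=tmp_dir)
finally:
    # The application owns the cleanup; nothing is registered with atexit.
    shutil.rmtree(tmp_dir, ignore_errors=True)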
I'm thinking that there should be a check for an existing message type module before a new one is created.
You should be able to perform that check and only conditionally invoke the function, no?
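For example, a minimal caller-side cache keyed on (datatype, md5sum) would invoke generate_dynamic only the first time a given definition is seen. The helper below is just a sketch; the names are illustrative and the arguments would come from the bag's connection records.

import genpy.dynamic

_msg_class_cache = {}

def get_message_class(datatype, md5sum, full_text):
    # datatype, md5sum, and full_text come from the bag's connection records
    # or raw message tuples.
    key = (datatype, md5sum)
    if key not in _msg_class_cache:
        # Only the first lookup for a given definition creates a temp directory.
        _msg_class_cache[key] = genpy.dynamic.generate_dynamic(datatype, full_text)[datatype]
    return _msg_class_cache[key]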
Another potential mitigation here would be comparing the bag's message definition with the one in the local workspace and just using the local message class when the MD5 is a match, rather than generating fresh classes indiscriminately.
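A sketch of that workspace-first approach, assuming roslib is available: roslib.message.get_message_class resolves locally built message classes, which carry their MD5 in the _md5sum attribute, and dynamic generation is used only as a fallback.

import genpy.dynamic
import roslib.message

def resolve_message_class(datatype, md5sum, full_text):
    # Prefer the class that is already installed in the local workspace.
    local_cls = roslib.message.get_message_class(datatype)
    if local_cls is not None and local_cls._md5sum == md5sum:
        return local_cls  # no temporary directory is created at all
    # The bag's definition differs from the workspace (or the type is unknown),
    # so fall back to dynamic generation.
    return genpy.dynamic.generate_dynamic(datatype, full_text)[datatype]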