Memory leak serving static files
Describe the bug: The memory of the process increases every time I request a static file.
Code snippet
from sanic import Sanic
import os

app = Sanic(__name__)
here = os.path.dirname(__file__)
app.static("/static", os.path.join(here, "static"))

if __name__ == "__main__":
    app.run()
Access a static file. The bigger the file, the more memory will be tied up: http://localhost:8000/static/index.html
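To put numbers on that growth, here is a minimal monitoring sketch (assuming psutil and requests are installed; the PID and URL are placeholders) that polls the server's RSS while the static file is requested repeatedly:

import sys
import time

import psutil    # assumption: psutil is available for reading process RSS
import requests  # assumption: requests is used as the test client

PID = int(sys.argv[1])                            # PID of the Sanic worker
URL = "http://localhost:8000/static/index.html"   # the static file from above

proc = psutil.Process(PID)
for i in range(200):
    requests.get(URL)
    if i % 20 == 0:
        # rss is the resident set size in bytes; it should flatten out,
        # not keep climbing, once responses have been sent.
        print(i, proc.memory_info().rss // (1024 * 1024), "MiB")
    time.sleep(0.01)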
Expected behavior: Memory should be freed after serving a static file, or at least not increase linearly with each request.
Environment (please complete the following information):
- OS: macOS BigSur 11.5.2
- Sanic version: 21.9.1
Update: I have been trying to replicate our setup as closely as possible and can now reproduce the issue quite reliably. Could you have a look here and let me know if this works for you? https://github.com/seibushin/sanic-mem-leak
Looking into this.
Same here. Memory keeps increasing until the server crashes. I am not sure if the memory leak is caused by files, but for info, I am using the response.file method when I send files. I'll try to gather more info about this bug. OS: Ubuntu 20.04, sanic==21.6.2
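For context, a rough sketch of the kind of handler described above (the route and path are made up; response.file_stream is the chunked alternative in Sanic's response module):

from sanic import Sanic, response

app = Sanic("file-server")

@app.get("/download/<name>")
async def download(request, name):
    # response.file reads the file and returns it in one response;
    # response.file_stream sends it in chunks instead.
    return await response.file(f"./files/{name}")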
@ahopkins is there anything we can do to better record what's going on with the memory leak?
@robd003 @seibushin Which loop are you using? asyncio, uvloop, or trio? It would also be helpful to know whether or not you are using stream_large_files.
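For reference, stream_large_files is passed to app.static; a minimal sketch (threshold semantics assumed from the docs: True uses the default cut-off, an integer sets a custom one in bytes):

from sanic import Sanic

app = Sanic("static-test")

# With stream_large_files enabled, files above the size threshold are streamed
# in chunks rather than read fully into memory before responding.
app.static("/static", "./static", stream_large_files=True)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)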
I'm using uvloop, not using stream_large_files. File sizes are between 100KB and 500KB.
Thanks @robd003 I am investigating now.
@Cayke How about you? What is general file size? Over what time period does this occur? About how much traffic are you seeing?
@ahopkins do you need any info from the running sanic process? Anything the community can do to help debug this issue?
I thought I was on to something, then I started thinking there might be a larger problem. I dug into that (turns out there is no problem there), and now I cannot reproduce anything :joy:
I will try again some more over the next week or so. I have gone through this very carefully, and nothing jumps out at me as something out of scope or not cleaned up, unless there is something in one of the dependencies. That is ultimately what led me to another conversation we have been having on Discord about the HUGE performance penalty incurred by aiofiles versus stdlib blocking reads.
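A quick-and-dirty way to see that penalty is a rough micro-benchmark sketch like the one below (file path and iteration count are arbitrary, and this deliberately ignores the event-loop blocking trade-off that aiofiles exists to avoid):

import asyncio
import time

import aiofiles  # assumption: aiofiles is installed

PATH = "static/index.html"  # any local test file
N = 500

async def read_with_aiofiles():
    start = time.perf_counter()
    for _ in range(N):
        async with aiofiles.open(PATH, "rb") as f:
            await f.read()
    return time.perf_counter() - start

def read_with_stdlib():
    start = time.perf_counter()
    for _ in range(N):
        with open(PATH, "rb") as f:
            f.read()
    return time.perf_counter() - start

print("aiofiles:", asyncio.run(read_with_aiofiles()))
print("stdlib:  ", read_with_stdlib())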
I couldn't reproduce the issue; every time the requests ended, I observed the memory being cleaned up by Sanic. I simply used a JSON file of about 100MB (or a little less) as the test static file and sent GET requests with a 10-thread Python client.
I'm wondering how I can reproduce the issue.
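For anyone else trying to reproduce it, a client along those lines might look like this sketch (URL and request count are placeholders):

from concurrent.futures import ThreadPoolExecutor

import requests  # assumption: any HTTP client works here

URL = "http://localhost:8000/static/big.json"  # ~100MB test file

def fetch(_):
    return len(requests.get(URL).content)

# 10 worker threads hammering the static route while the server's
# memory usage is watched from the outside.
with ThreadPoolExecutor(max_workers=10) as pool:
    sizes = list(pool.map(fetch, range(100)))

print(f"{len(sizes)} responses, {sum(sizes) / 1024 / 1024:.1f} MiB total")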
FWIW, it is good practice to set all variables that refer to other objects to None once you are finished with them. This breaks cycles that would otherwise prevent immediate deletion of objects when they are no longer needed. Such cycles are very easy to create when many objects hold references to each other, and then they are only freed much later, when the GC runs.
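A tiny illustration of the point (plain Python, nothing Sanic-specific):

import gc

class Node:
    def __init__(self):
        self.partner = None

a = Node()
b = Node()
a.partner = b
b.partner = a      # a and b now form a reference cycle

a.partner = None   # dropping the links first breaks the cycle, so plain
b.partner = None   # reference counting frees both objects as soon as...
del a, b           # ...the names go away

# Without the None assignments, the two Node objects would linger until the
# cyclic garbage collector runs; gc.collect() forces that pass and reports
# how many unreachable objects it found.
print(gc.collect())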
I am inclined towards closing this unless we have a good way to easily reproduce this. @sjsadowski Do you think you could give it a whirl on your Mac?
I have created a repo to reproduce the issue. Could you have a look here? https://github.com/seibushin/sanic-mem-leak
Let me know if that works for you and sorry for the late reply.
Initial memory usage: (screenshot)
After holding down Command + Shift + R for maybe 1 or 2 seconds: (screenshot)
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this is incorrect, please respond with an update. Thank you for your contributions.