Running Flood with >8k torrents locks rTorrent in IO loop
A user wrote:
I tried it with about 8k, and the response of /api/client/torrents was about 7 MB. It also seems to have put itself in a loop of checking the status of every file (a very long operation for me), which completely locked up rtorrent until I killed the server. Edit: Just to be clear, I was never even able to reach the overview page. 'Flood Settings' had yet to complete when I gave up and killed the server. Edit 2: Even with the server gone, rtorrent is still locked up :(
loop of checking the status of every file
By this I mean that when looking at the strace, it was polling every file with stat, similar to a save_session call. This is synchronous and takes a long time for me (more than a minute), long enough to drop peer connections. Watching the strace output, I would see a scan finish, then a short window where it was responsive, then the next scan would kick off. rTorrent never crashed; it just never got out of the IO deadlock long enough to do anything useful. Luckily the XMLRPC tool I use hangs until it's able to send the command, so I issued a quit and played the waiting game.
Also, I did confirm my install worked fine for an instance with a sane number of torrents.
Thanks for the additional information! I'll look into this.
Is there any way to trace the XMLRPC calls that are sent to rtorrent? I'm not too familiar with nodejs, but I'd be happy to perform any troubleshooting, even if it means applying a patch file or something.
@kannibalox Sorry for the delay, I totally forgot to reply to you.
You can see the XMLRPC calls that are sent to rTorrent by checking out the contents of this file: https://github.com/jfurrow/flood/blob/master/server/util/scgi.js. The XML is generated here and stored in the variable xml.
So you could add console.log(xml); to see it in the Node server's output, or, if the XML is particularly large, you can append the value to a file and inspect its contents with a text editor. This snippet should work for that (note that fs.appendFile requires a callback):
const fs = require('fs');
fs.appendFile('/path/to/file', xml, (err) => { if (err) console.error(err); });
Specifically, line 39 is where the request is sent to rTorrent.
@kannibalox It just dawned on me that this bug is probably caused by requesting d.free_diskspace= for every torrent... I'm going to feel really dumb if this is the case.
If you wouldn't mind testing this for me, I'd be super grateful. Try commenting out two lines in this file: https://github.com/jfurrow/flood/blob/master/shared/constants/torrentGeneralPropsMap.js (yes, this file is messy AF, I'm working on cleaning it up right now): lines 49 and 120.
You'll need to kill and restart the Flood server for your changes to take effect.
There might be other properties that cause I/O here also...
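If per-torrent I/O properties like d.free_diskspace= do turn out to be the cause, one possible mitigation, sketched below with made-up names (makeCached and queryRtorrentFreeSpace are illustrative, not Flood's actual API), would be to fetch such values once per interval rather than once per torrent:

```javascript
// Sketch: cache an expensive value for a TTL instead of asking
// rTorrent for it once per torrent in every multicall.
function makeCached(fetchFn, ttlMs) {
  let value;
  let fetchedAt = 0;
  return () => {
    const now = Date.now();
    if (now - fetchedAt >= ttlMs) {
      value = fetchFn(); // hits rTorrent (and the disk) once
      fetchedAt = now;
    }
    return value; // every other caller gets the cached copy
  };
}

// Stub standing in for the real SCGI round-trip (assumption).
function queryRtorrentFreeSpace() { return 123456789; }

// Ask for free disk space at most once per 10 s, instead of
// appending d.free_diskspace= to every torrent's request.
const getFreeDiskspace = makeCached(queryRtorrentFreeSpace, 10000);
```

Since free disk space is a per-filesystem value anyway, requesting it once per poll cycle should lose almost no information at 8k torrents while cutting thousands of stat-triggering calls.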
I made the change you proposed and saw the same behavior.
Hello, any news on this?
Indeed, I'm fed up with ruTorrent being a pile of...
Things needed, from even just reading the issue tracker:
- Scheduled removals (https://github.com/jfurrow/flood/issues/371)
- Move data actually working (https://github.com/jfurrow/flood/issues/581)
- Scalability without locking up rTorrent for minutes on end
#581 and #371 could probably be done given unlimited time from the volunteer who maintains this project, but "scalability without locking up rTorrent for minutes on end" is incredibly hard even if there were unlimited time.
Any news on this - almost 2 years now :)
It's an extremely hard problem to solve, feel free to give it a try yourself. I'm more than willing to lock up my instances in pursuit if need be.