
Poor responsiveness with many torrents or torrents with many files

Open dgcampea opened this issue 2 years ago • 13 comments

The UI is poorly responsive with a library of ~400 torrents: most buttons and actions take about a second to respond to user input. For torrents with many files (around 15,000), the file-selection view that opens when double-clicking a torrent entry can take more than 10 seconds to list the files or to perform any action such as selecting/unselecting them.

I've tested with Firefox and Chromium and the issues are present in both browsers (Chromium is running without any extensions).

Transmission version: 3.00
flood-for-transmission version: c4bbb0cf38fe3c614b8808ec4b10eb10bcaca76e

dgcampea avatar Jan 30 '22 14:01 dgcampea

Did this only recently start or was this already the case before? Do you have a before/after commit difference?

I'm somewhat aware of the file structure issue; I have seen that happen too, and I guess it needs some optimization. But I have never seen any issue with the torrent list itself. Then again, I don't have such a big collection, so it would require some testing to see what's going on.

I'm sorry, but I don't think I will have time to look at this soon. But I will definitely have a look when I have the time/energy to do so! ;)

johman10 avatar Jan 30 '22 15:01 johman10

Did this only recently start or was this already the case before?

It wasn't recent AFAIK; I started to notice it as more and more torrents were added over time.

I'm sorry, but I don't think I will have time to look at this soon. But I will definitely have a look when I have the time/energy to do so! ;)

Thank you! :+1:

dgcampea avatar Jan 30 '22 16:01 dgcampea

I am also experiencing much the same problems. I'm not sure when it started, but it seems to have steadily become more of a problem as my list of torrents has grown. I'm currently at around 300, and just doing a simple text search takes 2-3 seconds during which everything freezes; then the search completes and the UI resumes.

I tried opening htop while doing a text search and can see that Firefox, for instance, uses 100% of a single CPU core until the search completes.

Flood: dd047bd
Transmission: 3.00 (bb6b5a062e)
Via: linuxserver/transmission:latest (3.00-r5-ls115) Docker image

Edit 1: I also tried switching to the native UI and to transmission-web-control, both of which react more or less instantly when searching through the torrents, so this does not appear to be a transmission/server/network/browser limitation but rather something specific to flood.

kristianvld avatar Mar 13 '22 23:03 kristianvld

Hi! I started experimenting a little with this, but no real luck yet. I created a branch with some attempts though; feel free to have a look: https://github.com/johman10/flood-for-transmission/tree/investigate-performance-issues.

What I tried:

  • Don't filter torrents, but rather hide them
  • Remove some mousemove event listeners for the resizable table which shouldn't really be there
  • Debounce the search so that it doesn't trigger on every character added/removed (a rough sketch of what I mean follows below)
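
Roughly what I have in mind for the debouncing, as a minimal sketch (the store names and the 300 ms delay are placeholders, not the actual code):

import { writable } from 'svelte/store';

// Raw value bound to the search input.
export const searchInput = writable('');

// Debounced value that the torrent filter actually subscribes to,
// so filtering only runs once typing pauses for ~300 ms.
export const debouncedSearch = writable('');

let timer: ReturnType<typeof setTimeout>;
searchInput.subscribe((value) => {
  clearTimeout(timer);
  timer = setTimeout(() => debouncedSearch.set(value), 300);
});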

None of these clearly solved the issue. I'm not sure what the next steps are, but at least I'm able to reproduce it when I throttle my CPU in the Chrome Performance tab.

johman10 avatar Apr 15 '22 10:04 johman10

I guess the standard thing to do is virtualised tables? It's definitely a last-resort thing and shouldn't be needed for 300 rows, but if the rest is already close to optimal, it's a good shot at mitigating the performance issues.
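
For reference, the core trick is just computing which slice of rows is on screen and rendering only those, with spacers filling the rest. A rough sketch (fixed row height and the overscan value are assumptions, not tied to the actual table component):

// Given the scroll position, decide which rows to actually render.
// Everything outside this window is replaced by top/bottom spacers.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number, // assumes fixed-height rows
  total: number,
  overscan = 5
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const end = Math.min(total, Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan);
  return { start, end };
}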

si14 avatar Apr 16 '22 02:04 si14

Yeah, I was considering it, but I would really rather not. It adds a whole bunch of complexity to a fairly simple list. Plus, Svelte should be fast; I'm probably just doing something wrong, and my guess is that the filters have too many side effects to actually be fast.

I would at least like to poke at it more before diving into virtualized lists! :)

johman10 avatar Apr 16 '22 04:04 johman10

Fair enough, I agree it's a huge pain :)

si14 avatar May 01 '22 18:05 si14

Yeah, I was considering it, but I would really rather not. It adds a whole bunch of complexity to a fairly simple list. Plus, Svelte should be fast; I'm probably just doing something wrong, and my guess is that the filters have too many side effects to actually be fast.

I believe you are probably right. I don't think we are anywhere near pushing enough DOM elements for that to be the problem; as you mentioned, Svelte should be fast at this kind of thing. So there is probably some faulty logic instead, maybe some nested looping over the list or similar. 400 elements, even with sub-elements, should be fairly simple to render and search through. Heck, the browser's built-in CMD-F/CTRL-F works instantly when searching all the text on the page.

Here are also some demonstrations from a talk Rich Harris gave about Svelte, showing how fast it is and how many elements it can render: https://rethinking-reactivity.surge.sh/#slide=24 https://rethinking-reactivity.surge.sh/#slide=25

kristianvld avatar May 02 '22 15:05 kristianvld

So there is probably some faulty logic instead, maybe some nested looping over the list or similar. 400 elements, even with sub-elements, should be fairly simple to render and search through.

I think the debouncing helped quite a bit here, just like in the example slide you shared, but it isn't great yet. I think the problem might be the interlacing of store dependencies, which triggers multiple rerenders. I might have to simplify some things in that area to improve it, but I'm not seeing a super straightforward way to do that right now.
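
To make that concrete, the kind of simplification I have in mind is collapsing the chained stores into a single derived one, so one input change causes one recomputation. Just a sketch; the store names and the './stores' module are made up, not the current code:

import { derived } from 'svelte/store';
// torrents, debouncedSearch and sortColumn are hypothetical names standing in
// for whatever the real store module exposes.
import { torrents, debouncedSearch, sortColumn } from './stores';

// One derived store instead of filter -> sort -> render each updating separately.
export const visibleTorrents = derived(
  [torrents, debouncedSearch, sortColumn],
  ([$torrents, $search, $sort]) => {
    const query = $search.toLowerCase();
    return $torrents
      .filter((t) => t.name.toLowerCase().includes(query))
      .sort((a, b) => (a[$sort] > b[$sort] ? 1 : a[$sort] < b[$sort] ? -1 : 0));
  }
);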

Heck, the browser's built-in CMD-F/CTRL-F works instantly when searching all the text on the page.

This doesn't modify the DOM though, so for that reason it's super fast. However, this is another reason not to use virtualized lists, since they would break that behaviour.

Either way, this should be possible without virtualized lists, but I haven't found the solution yet. If anyone wants to review my code and point out anything that looks off, you're very welcome! :)

johman10 avatar May 02 '22 17:05 johman10

I am now experiencing the issue, too. I don't know if you were able to reproduce it; if not, this might help: I'm seeing a LOT of POSTs to /transmission/rpc even before the page is able to open. Here's a bunch of the POST bodies:

{"arguments":{"ids":"recently-active","fields":["name","percentDone","metadataPercentComplete","recheckProgress","eta","rateDownload","rateUpload","sizeWhenDone","downloadedEver","uploadedEver","peersSendingToUs","peersConnected","peersGettingFromUs","addedDate","labels","trackers","uploadRatio","trackerStats","status","error","id","magnetLink","bandwidthPriority"]},"method":"torrent-get"}
{"arguments":{"fields":["alt-speed-down","alt-speed-enabled","alt-speed-up","speed-limit-down-enabled","speed-limit-down","speed-limit-up-enabled","speed-limit-up","units"]},"method":"session-get"}
{"arguments":{"ids":"recently-active","fields":["name","percentDone","metadataPercentComplete","recheckProgress","eta","rateDownload","rateUpload","sizeWhenDone","downloadedEver","uploadedEver","peersSendingToUs","peersConnected","peersGettingFromUs","addedDate","labels","trackers","uploadRatio","trackerStats","status","error","id","magnetLink","bandwidthPriority"]},"method":"torrent-get"}
{"arguments":{"ids":"recently-active","fields":["name","percentDone","metadataPercentComplete","recheckProgress","eta","rateDownload","rateUpload","sizeWhenDone","downloadedEver","uploadedEver","peersSendingToUs","peersConnected","peersGettingFromUs","addedDate","labels","trackers","uploadRatio","trackerStats","status","error","id","magnetLink","bandwidthPriority"]},"method":"torrent-get"}
{"arguments":{"fields":["alt-speed-down","alt-speed-enabled","alt-speed-up","speed-limit-down-enabled","speed-limit-down","speed-limit-up-enabled","speed-limit-up","units"]},"method":"session-get"}

Literally hundreds of them.

si14 avatar May 11 '22 23:05 si14

Oh wait, disregard that, sorry: it's just polling, and I had forgotten to mount a volume, so it wasn't actually lagging for that long. Sorry 😄

si14 avatar May 12 '22 01:05 si14

Re-posting what I wrote in #561 here instead:

I have over 1000 torrents, and the UI freezes for a couple of seconds whenever I change the sort column/filter with lots of torrents showing.

I tried building and running the dev version to do some profiling, but I don't know enough about frontend dev, and even less about Svelte, to figure much out.
What seems to stand out is that Firefox is spending a lot of time garbage collecting (over 100 ms at a time during a page rebuild), so I'm guessing something is generating lots of temporary objects, and the resulting GC pressure is causing the freezes.
Chromium has the same symptoms, but I didn't try profiling there.
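
Purely a wild guess at the kind of pattern that would explain it: if the per-row display data is rebuilt from scratch for every torrent on every poll, caching it per torrent would avoid the churn. All names below are made up for illustration, not taken from the codebase:

// Hypothetical cache: only rebuild a row's derived display data when the
// underlying torrent object actually changed, instead of allocating fresh
// objects every poll, which is what would create GC pressure.
interface Torrent { id: number; name: string; sizeWhenDone: number; }
interface RowData { name: string; size: string; }

const rowCache = new Map<number, { source: Torrent; display: RowData }>();

function buildRowData(t: Torrent): RowData {
  // Stand-in for the expensive per-row formatting.
  return { name: t.name, size: `${(t.sizeWhenDone / 1e9).toFixed(2)} GB` };
}

function rowFor(torrent: Torrent): RowData {
  const cached = rowCache.get(torrent.id);
  if (cached && cached.source === torrent) return cached.display;
  const display = buildRowData(torrent);
  rowCache.set(torrent.id, { source: torrent, display });
  return display;
}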

I tried deleting the contents of the <tr> in Torrent.svelte and that mostly solves the problem, so investigation can probably focus on that part of the code. But that's probably obvious since that's what scales with the number of torrents.

Here's a tarball with 1500 torrents I generated so you can easily reproduce this: torrents.tar.gz
Note that there's no tracker configured and they're marked private so they should generate no network traffic at all. Generated with:
for ((i=0; i<1500; i++)); do echo "$i" > file && mktorrent file --private --no-date --output "$(printf '%04d' "$i")".torrent; done

Thanks for the project; I've been using it for a while and really like it, but the performance is unfortunately making it hard to use nowadays.

ThinkChaos avatar Feb 11 '24 05:02 ThinkChaos

Hi everyone! I made some changes in how requests are handled in the latest version (https://github.com/johman10/flood-for-transmission/pull/558).

The UI uses a polling approach to fetch torrents, and previously a new request would start even if the previous one hadn't finished. With the changes in #558 this should no longer happen, which could have a positive effect on overall performance.
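
The gist of it is a guard against overlapping polls, roughly like the sketch below (simplified; the function names and the interval are illustrative, not the actual code from #558):

let pollInFlight = false;

// Runs on the polling interval; if the previous request hasn't come back yet,
// skip this tick instead of stacking another request on top of it.
async function pollTorrents(): Promise<void> {
  if (pollInFlight) return;
  pollInFlight = true;
  try {
    await fetchRecentlyActiveTorrents(); // hypothetical wrapper around the torrent-get RPC
  } finally {
    pollInFlight = false;
  }
}

setInterval(pollTorrents, 5000);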

I would love to have some feedback on this if you have any. Has the performance improved, degraded, or remained the same? Please let me know, as input for further improvements down the line.

johman10 avatar Apr 28 '24 10:04 johman10