
[Meta] Invidious has a memory leak

Open TheFrenchGhosty opened this issue 5 years ago • 20 comments

This has already been reported multiple times; like #1345, this is an issue made to group together the other issues where this problem was reported.

This is also the issue where we will keep track of progress on fixing it.


Invidious has a massive memory leak.

As shown in this screenshot...

[Screenshot: memory usage graph of a public instance]

...the memory usage of Invidious can climb from less than 500 MB to 4 GB in 15 minutes. The dips in memory usage are restarts or crashes; another issue related to the crashes is available here: #1439.

The speed at which the memory leaks depends on the number of users on an instance. The instance in the screenshot is a public one used by thousands of users.

Previous issues where this was reported: #1415, #721

Previous discussion about it: #1051

TheFrenchGhosty avatar Oct 24 '20 19:10 TheFrenchGhosty

A single memory leak is probably not the whole story. There are probably multiple ones.

SuperSandro2000 avatar Oct 25 '20 02:10 SuperSandro2000

I get memory leaks at precisely 12 AM every night, and if I change the time zone in settings it recurs at 12 AM in the new time zone, even if it is actually broad daylight outside. It looks like intentional malicious code is injected in about half of the instances.

GitWaifu avatar Nov 07 '20 07:11 GitWaifu

It looks like intentional malicious code is injected in about half of the instances.

Why does it look like that? I've never heard of this "12 AM memory leak" before. Do you run backups at 12 AM? Do you run other cron jobs at this time? Does it also happen on a fresh, clean server?

Perflyst avatar Nov 07 '20 11:11 Perflyst

@GitWaifu

It looks like intentional malicious code is injected in about half of the instances.

So, according to you, omarroth intentionally added code to trigger a memory leak that ends up crashing the instances, because he absolutely doesn't want you to host it?

Seriously? Don't use the software if you don't like it, but stop saying bugs are caused by malicious intent.

PS: read this: https://en.wikipedia.org/wiki/Correlation_does_not_imply_causation

TheFrenchGhosty avatar Nov 07 '20 18:11 TheFrenchGhosty

Is there a way to run Valgrind or something similar?

Malicious intent, even if it's probably not the primary cause, should not be excluded. This kind of product casts a shadow over Google, and they have already played games with user agents against Firefox and Internet Explorer, so why wouldn't they try every means to shut down this kind of project? Also, not in this ecosystem but in the JavaScript one, several libraries were (and will be) compromised; that's a side effect of dependency managers (which should not be used, if you want my point of view).

The instances with trouble may have used an installer (I did), and in that case a third party could be the root cause too.

HumanG33k avatar Nov 26 '20 19:11 HumanG33k

I get memory leaks at precisely 12 AM every night, and if I change the time zone in settings it recurs at 12 AM in the new time zone, even if it is actually broad daylight outside.

Sounds more like Invidious does not handle the day change too well and is leaking memory somewhere.

SuperSandro2000 avatar Nov 28 '20 19:11 SuperSandro2000

I have done a test on an instance on localhost (so I am sure nobody can connect to it). I launch it, watch a video, and then stop using it (I even close the page in the browser to make sure no JavaScript is running in the background). I can see that the Invidious process uses some CPU every 1 to 2 minutes, and just afterwards the memory consumption of the container increases by ~2 MB and never drops.

Not sure if it helps.
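For anyone who wants to reproduce this measurement, a polling loop along these lines would work (a minimal Python sketch; the container name `invidious` and the one-minute interval are assumptions, adjust them to your setup):

```python
import subprocess
import time

CONTAINER = "invidious"  # assumed container name; adjust to your setup

# Poll the container's memory usage once a minute and print a timestamped
# log line. Stop with Ctrl+C.
while True:
    result = subprocess.run(
        ["docker", "stats", "--no-stream", "--format", "{{.MemUsage}}", CONTAINER],
        capture_output=True, text=True, check=True,
    )
    # Output looks like "512MiB / 2GiB"; keep only the current usage.
    usage = result.stdout.strip().split(" / ")[0]
    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {usage}")
    time.sleep(60)
```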

doc75 avatar Nov 29 '20 09:11 doc75

@SuperSandro2000

I get memory leaks at precisely 12 AM every night, and if I change the time zone in settings it recurs at 12 AM in the new time zone, even if it is actually broad daylight outside.

Sounds more like Invidious does not handle the day change too well and is leaking memory somewhere.

Really interesting... it might indeed be a thing; it's a bit strange, though...

TheFrenchGhosty avatar Dec 01 '20 21:12 TheFrenchGhosty

OK, so we have a start. An egrep on the code for values related to time might find something; try the 2-minute value in its different representations (seconds, minutes, milliseconds, and hex). See the sketch below.

Maybe it is a bug in the code implementation.
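A minimal sketch of that kind of search (in Python; the patterns are illustrative guesses at time-related constants, not known culprits, and the `src` directory layout is an assumption):

```python
import os
import re

# Patterns that could plausibly encode a 2-minute or daily interval:
# 120 (seconds), 2.minutes (Crystal time span), 120_000 (milliseconds),
# 0x78 (120 in hex), 1.day (daily job).
PATTERN = re.compile(r"\b(120|2\.minutes?|120_000|0x78|1\.day)\b")

for root, _dirs, files in os.walk("src"):  # assumed source directory
    for name in files:
        if not name.endswith(".cr"):  # Crystal source files
            continue
        path = os.path.join(root, name)
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                if PATTERN.search(line):
                    print(f"{path}:{lineno}: {line.rstrip()}")
```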

@doc75, when you test on localhost, do you mean without network access? If not, or if there is no restriction on Google IPs, somebody could still connect to your device if the code were malicious.

HumanG33k avatar Dec 04 '20 23:12 HumanG33k

@HumanG33k, when I tested on the local network, I meant that I had a network connected to the internet, but definitely no incoming connections. You are right that Invidious was able to connect to the internet by itself. The point I want to make is that it seems to leak even when no one is connected.

I also noticed that after a while it stops growing (so it might be linked to initialization), but I did not let it run for more than 7-8 hours.

I will try without internet access to see if the memory also increases.

doc75 avatar Dec 05 '20 08:12 doc75

Some more details can be found in this issue: https://github.com/iv-org/documentation/issues/241#issuecomment-1167236305

TheFrenchGhosty avatar Jul 01 '22 16:07 TheFrenchGhosty

[Screenshot: memory usage graph]

Please prioritize this. This is getting annoying.

pistasjis avatar Oct 29 '22 18:10 pistasjis

I do not observe this. I have been running Invidious in Docker in an unprivileged LXC (on ZFS) for two years, upgrading only once in a while with Watchtower. I never restart it and it never crashes.

I did a manual check to verify whether there is any change in memory consumption on my hypervisor during a `docker compose down && docker compose up -d`, together with an update. This is the graph:

[Screenshot: hypervisor memory graph]

I issued the down & up command at 6:41 AM and no impact can be seen.

Sieboldianus avatar Oct 18 '23 04:10 Sieboldianus

Thank you for your input @Sieboldianus.

There are actually still some memory leaks/issues that are easily reproducible when running Invidious under high load, like on a public instance.

It's great though that on a private, low-traffic instance there is no longer any issue.

unixfox avatar Oct 30 '23 22:10 unixfox

I have a public instance with at most 10 users, hosted on a tiny machine with 1 GB of RAM. The memory usage causes a crash every few hours.

It isn't instantly obvious, but it builds up over time. It went from 56% usage to 70% after 10 videos and never drops back down.
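A rough way to quantify that per-view growth (a Python sketch; the base URL, container name, and video ID are placeholders for your own setup, not values from this report):

```python
import subprocess
import urllib.request

BASE_URL = "http://localhost:3000"   # hypothetical instance address
CONTAINER = "invidious"              # assumed container name
VIDEO_ID = "dQw4w9WgXcQ"             # any known video ID works

def container_memory() -> str:
    """Return the container's current memory usage as reported by docker stats."""
    out = subprocess.run(
        ["docker", "stats", "--no-stream", "--format", "{{.MemUsage}}", CONTAINER],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.strip().split(" / ")[0]

print("before:", container_memory())
for _ in range(10):
    # Fetch the watch page ten times, mimicking ten video views.
    with urllib.request.urlopen(f"{BASE_URL}/watch?v={VIDEO_ID}") as resp:
        resp.read()
print("after: ", container_memory())
```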

gptlang avatar Nov 15 '23 21:11 gptlang