RAM leak?
It seems that if I don't use clearAllCache(), the memory goes up every time something is rendered.
I can confirm this issue.
The issue does not exist in 0.1.38, but it does exist in 0.1.44.
Basically there is a huge memory leak: after generating a few hundred images (and discarding them), RSS usage went up to over 1.4 GB.
Even after clearAllCache(), only around 400 MB was recovered (1.0 GB still in use).
In 0.1.38, the same test stays around 600 MB RSS and never goes above 700 MB.
In the picture below, I am running 0.1.44 and executing the following three commands:
util.inspect(process.memoryUsage())
clearAllCache()
util.inspect(process.memoryUsage())
In the picture below, I am running 0.1.38. While the RSS does spike to around 1 GB while the pictures are being generated and shortly afterwards, within 15 seconds it is back to ~600 MB, as you can see here:
Test conditions:
- Node: v18.16.1
- OS: Windows 11 and Linux Debian 10.13
- @napi-rs/canvas: 0.1.38 (ok) and 0.1.44 (bugged)
- 200 images generated per test, size 488 x 331, via new Canvas() and getContext('2d'), with 56 individual PNGs loaded as Images via loadImage(), overlaid onto the parent image with Canvas.drawImage(), a simple text header drawn via 2dcontext.drawText(), exported as PNGs, and discarded.
- The test was repeated both with all images drawn 'at the same time' via promiseArray.push()... await Promise.all(promiseArray), and with each image awaited individually, with virtually identical results. The results shown above are from the promiseArray variant (a sketch of this harness follows below).
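For context, a minimal sketch of the harness described above, under the stated conditions (the file names, overlay layout, and header text are made up for illustration, and fillText stands in for the text-drawing call):

    import { createCanvas, loadImage } from '@napi-rs/canvas'

    // Hypothetical overlay files; the real test used 56 individual PNGs.
    const overlayPaths = Array.from({ length: 56 }, (_, i) => `./overlays/${i}.png`)
    const overlays = await Promise.all(overlayPaths.map((p) => loadImage(p)))

    async function renderOne(index) {
      const canvas = createCanvas(488, 331)
      const ctx = canvas.getContext('2d')

      // Overlay each loaded image onto the parent canvas.
      for (const img of overlays) {
        ctx.drawImage(img, 0, 0)
      }

      // Simple text header.
      ctx.font = '16px sans-serif'
      ctx.fillText(`Image ${index}`, 10, 20)

      // Export as PNG; the buffer is returned and then discarded by the caller.
      return canvas.encode('png')
    }

    // promiseArray variant: render all 200 images "at the same time".
    const promiseArray = []
    for (let i = 0; i < 200; i++) {
      promiseArray.push(renderOne(i))
    }
    await Promise.all(promiseArray)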
I have the memory leak too.
Same here; even clearAllCache() doesn't work.
I have the issue in 0.1.38 too.
I confirm the issue: with 0.1.44, RAM never gets released. Downgrading to 0.1.37 fixes the issue.
I tested on 0.1.45 and saw no significant memory leak when inspecting process.memoryUsage(), but once my application was containerised I saw HUGE memory leaks in the docker stats memory usage graphs.
I managed to narrow it down to calling canvas.toBuffer() or canvas.encode(); as soon as I left that line out, the memory leak stopped occurring, but I also had no output...
I'm talking about a 60~80 MB memory increase per render of an 8 MB PNG, consistently, per render. With that much increase I was getting over 1 GB of usage, while on 0.1.41 it never goes above 200 MB and, after being idle for a bit, sits at 75 MB.
And yes, I also tried clearAllCache; it did nothing.
Since this package has bindings to a native binary, it makes sense that Node.js does not detect the memory leak.
I have not seen others report this yet, so I figured I would report my findings. Hope it gets fixed sometime; until then, pinning the version to 0.1.41 works for me.
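A rough sketch of that kind of check, isolating the toBuffer() call while logging RSS (the source image and sizes here are placeholders, not from the report):

    import { createCanvas, loadImage } from '@napi-rs/canvas'

    // Placeholder input; the report describes rendering an ~8 MB PNG per frame.
    const source = await loadImage('./big-source.png')

    for (let i = 0; i < 100; i++) {
      const canvas = createCanvas(source.width, source.height)
      const ctx = canvas.getContext('2d')
      ctx.drawImage(source, 0, 0)

      // Leaving this line out reportedly removes the leak (but also produces no output).
      const png = canvas.toBuffer('image/png')

      console.log(i, (process.memoryUsage().rss / 1024 / 1024).toFixed(1), 'MB RSS')
    }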
@Brooooooklyn If you could fix this, people would be very grateful. It's literally unusable unless you pre-generate images. There is 100% a leak and if you know where it is, that would be helpful for someone making a fix.
https://github.com/Brooooooklyn/canvas/blob/f3a699bbb5a1fae69ebb8a02499ce6279624afe1/src/lib.rs#L259
Is the usage of noop here the cause?
I'm looking into this issue. What kind of image do you generate with @napi-rs/canvas? I'm trying to create a minimal repro.
In my case, a canvas of size 3000x925.
Methods and attributes used (maybe useful for reproduction code):
fillStyle, fillRect, textAlign, setLineDash, strokeRect, save, restore, translate, font, textBaseline, beginPath, moveTo, lineTo, closePath, fill, stroke, lineWidth, strokeStyle
Then I call canvas.toBuffer('image/jpeg') to get my image.
Apparently this leads to the memory leak, but I haven't tested that myself.
I have 3 registered fonts.
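Based on that description, a minimal sketch exercising the same calls might look like the following (the coordinates, colours, and the commented-out font registration are placeholders, not the actual code):

    import { createCanvas, GlobalFonts } from '@napi-rs/canvas'

    // Placeholder font registration; the real setup registers 3 fonts.
    // GlobalFonts.registerFromPath('./fonts/SomeFont.ttf', 'SomeFont')

    const canvas = createCanvas(3000, 925)
    const ctx = canvas.getContext('2d')

    ctx.save()
    ctx.translate(10, 10)
    ctx.fillStyle = '#ffffff'
    ctx.fillRect(0, 0, 2980, 905)

    ctx.font = '24px sans-serif'
    ctx.textAlign = 'left'
    ctx.textBaseline = 'top'

    ctx.setLineDash([4, 2])
    ctx.lineWidth = 2
    ctx.strokeStyle = '#000000'
    ctx.strokeRect(0, 0, 2980, 905)

    ctx.beginPath()
    ctx.moveTo(0, 450)
    ctx.lineTo(2980, 450)
    ctx.closePath()
    ctx.fill()
    ctx.stroke()
    ctx.restore()

    // Encoding to JPEG is the step suspected of leaking.
    const jpeg = canvas.toBuffer('image/jpeg')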
@SteadEXE thank you, I'll try
Does anyone actually hit a real-world OOM? I'm wondering if the RSS is allocated memory rather than the actually used memory.
In my case, the process eats all the available RAM on Windows until the system itself runs out of memory. Even after an idle period, the RAM never gets released; the node process has to crash or be restarted.
After some tests, I can confirm that after generating 1000+ images on my side I reach 4.5 GB+ of RAM, while I never go over 300 MB on 0.1.37.
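One way to check whether the growth is JS-visible or purely native is to log the individual fields of process.memoryUsage(): if rss keeps climbing while heapUsed and external stay flat, the allocations are on the native side. A small sketch (not from the thread):

    // Separate the JS heap (heapUsed) from native allocations (rss, external).
    setInterval(() => {
      const { rss, heapUsed, external } = process.memoryUsage()
      const mb = (n) => (n / 1024 / 1024).toFixed(1) + ' MB'
      console.log(`rss=${mb(rss)} heapUsed=${mb(heapUsed)} external=${mb(external)}`)
    }, 5000)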
The issue is an actual memory leak, as Bun.gc(true) would clear e.g. large arrays and other things. The size of the leak goes up with image quality, so it's likely related to the buffer itself not being destroyed properly.
Simply generating a few captchas on demand is basically impossible, which is how I found it.
import { createCanvas } from "@napi-rs/canvas";

// Must be async because it awaits canvas.encode().
async function test() {
  const canvas = createCanvas(600, 600);
  const ctx = canvas.getContext("2d");
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  // The memory leak is much worse when the quality is high. It only leaks when this is called.
  await canvas.encode("webp", 100);
}

// Fires all 5000 renders without awaiting them individually.
for (let i = 0; i < 5000; i++) {
  test();
}

setInterval(() => {
  Bun.gc(true);
  console.log(process.memoryUsage().rss / 1024 / 1024);
}, 1000);
Can you try this version to see if it fixes your issue? https://github.com/Brooooooklyn/canvas/releases/tag/v0.1.48
Edit: it's only leaking when you encode with PNG; WebP was fixed. Not sure about the others.
For me, it seems like it's still leaking the same amount on 0.1.48, or in any case it's still leaking regardless. 48k calls to this = 7474 MB:
import { createCanvas } from "@napi-rs/canvas";

const canvas = createCanvas(200, 200);
const ctx = canvas.getContext("2d");
ctx.fillStyle = "black";
ctx.fillRect(0, 0, canvas.width, canvas.height);
// PNG encoding is the call that still leaks here (top-level await, so run as an ES module).
const encoded = await canvas.encode("png");
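Presumably the 48k calls were made by driving that snippet in a loop along these lines (a sketch; the loop count and logging are illustrative):

    import { createCanvas } from "@napi-rs/canvas";

    for (let i = 0; i < 48_000; i++) {
      const canvas = createCanvas(200, 200);
      const ctx = canvas.getContext("2d");
      ctx.fillStyle = "black";
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      // PNG encoding is the step reported to still leak on 0.1.48.
      await canvas.encode("png");

      if (i % 1000 === 0) {
        console.log(i, (process.memoryUsage().rss / 1024 / 1024).toFixed(1), "MB RSS");
      }
    }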
@7te3ep I can't reproduce the leak with the code:
import { createCanvas } from './index.js'

async function test() {
  const canvas = createCanvas(2560, 1920)
  const ctx = canvas.getContext('2d')
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  await canvas.encode("webp", 100);
}

for (let i = 0; i < 5000; i++) {
  await test();
  // global.gc is only defined when Node is started with --expose-gc; otherwise this is skipped.
  global?.gc?.()
  console.info(process.memoryUsage().rss / 1024 / 1024 + " MB");
}

setInterval(() => {
  console.info(process.memoryUsage().rss / 1024 / 1024 + " MB");
}, 1000);
On my MacBook, the output was like:
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
285.234375 MB
As you can see, the memory usage is very stable
I'm not sure if you were replying to me, but it's leaking with png, not webp. webp is fine now.
Confirmed that PNG is still leaking; I will fix it immediately.
For me it looks like the issue is fixed, thank you 🥇