
Memory leak when only using marshalling

Open · Matthiee opened this issue 1 year ago

Issue

When code returns data from the VM to the host, the memory used keeps increasing with each `evalCode` call.

Additional context

"quickjs-emscripten": "^0.23.0",
"quickjs-emscripten-sync": "^1.5.2",

We are using this package to marshal many classes, objects, and functions from the host to the VM. Because of this, we use a single long-lived `QuickJSContext`.

Our host needs to call methods on instances of classes created inside the VM. These methods may return either simple data objects or a Promise containing simple data, and they may use functions made available from the host to the VM.

We have no use case for using the syncing functionality provided by this package.

It appears that, because of the syncing capabilities of this package, memory is retained by the `QuickJSRuntime` for each `evalCode` call that returns data.
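To illustrate the hypothesis with a simplified model (this is my own sketch, not the library's actual code; the assumption is that every value marshalled from VM to host is tracked for later syncing and plain-data entries are never released):

```typescript
// Simplified illustrative model of the suspected retention pattern.
// NOT quickjs-emscripten-sync's actual implementation.
class SyncRegistry {
  private retained: unknown[] = [];

  // Hypothetically called for every value returned from the VM to the host.
  track<T>(value: T): T {
    this.retained.push(value); // retained for syncing, never released
    return value;
  }

  get size(): number {
    return this.retained.length;
  }
}

const registry = new SyncRegistry();

// Analogous to calling arena.evalCode("globalThis.test.check()") in a loop:
for (let i = 0; i < 10; i++) {
  registry.track({ id: "some id", data: 123 });
}

console.log(registry.size); // one entry retained per call
```

Under this model, memory grows by one entry per call even though the host only needed a plain copy of the data, which matches the behavior in the reproduction below.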

What I already tried

Disabling `isHandleWrappable` and `isWrappable`. While this appeared to work at first glance, it doesn't work when the code uses async/await, and it also doesn't work when using `moduleLoader`.

Expected

No memory is retained when the syncing capabilities are not used.

Reproduce

  • memory no leak will log an increase of 948.
  • memory leak will log a much higher number.

If you change the number of iterations of `arena.evalCode("globalThis.test.check()")`, the memory used in the no-leak scenario stays at 948, while the leak scenario keeps increasing with each iteration.

  test("memory no leak", async () => {
    const ctx = (await getQuickJS()).newContext();
    const arena = new Arena(ctx, { isMarshalable: true });

    const getMemory = () => {
      const handle = ctx.runtime.computeMemoryUsage();
      const mem = ctx.dump(handle);
      handle.dispose();
      return mem;
    };

    arena.evalCode(`globalThis.test = {
      check: () => {
        return {
          id: 'some id',
          data: 123
        };
      }
    }`);

    const memoryBefore = getMemory().memory_used_size as number;

    for (let i = 0; i < 10; i++) {
      const handle = ctx.unwrapResult(ctx.evalCode("globalThis.test.check()"));
      const data = ctx.dump(handle);
      handle.dispose();

      expect(data).toStrictEqual({ id: 'some id', data: 123 });
    }

    const memoryAfter = getMemory().memory_used_size as number;

    console.log("Allocation increased %d", memoryAfter - memoryBefore);

    arena.dispose();
    ctx.dispose();
  });

  test("memory leak", async () => {
    const ctx = (await getQuickJS()).newContext();
    const arena = new Arena(ctx, { isMarshalable: true });

    const getMemory = () => {
      const handle = ctx.runtime.computeMemoryUsage();
      const mem = ctx.dump(handle);
      handle.dispose();
      return mem;
    };

    arena.evalCode(`globalThis.test = {
      check: () => {
        return {
          id: 'some id',
          data: 123
        };
      }
    }`);

    const memoryBefore = getMemory().memory_used_size as number;

    for (let i = 0; i < 10; i++) {
      const data = arena.evalCode("globalThis.test.check()");
      expect(data).toStrictEqual({ id: 'some id', data: 123 });
    }

    const memoryAfter = getMemory().memory_used_size as number;

    console.log("Allocation increased %d", memoryAfter - memoryBefore);

    arena.dispose();
    ctx.dispose();
  });

Matthiee · Nov 13 '23 16:11