
Feature Request: Support for V8 ValueSerializer and ValueDeserializer

Open • joakouy opened this issue 6 months ago • 3 comments

Hi, first of all thank you for your great work on Javet.

I’m currently using JSON.stringify() and JSON.parse() to serialize and deserialize objects between Java and JavaScript via V8. However, in my use case, performance is critical, and the JSON-based approach is becoming a bottleneck.
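
For reference, this is roughly what the current JSON-based round trip looks like through Javet's executor API. A minimal sketch, assuming a plain V8 runtime; the object name myObject and its contents are purely illustrative:

import com.caoccao.javet.exceptions.JavetException;
import com.caoccao.javet.interop.V8Host;
import com.caoccao.javet.interop.V8Runtime;

public class JsonRoundTrip {
    public static void main(String[] args) throws JavetException {
        try (V8Runtime v8Runtime = V8Host.getV8Instance().createV8Runtime()) {
            // Build an object inside the JS runtime (illustrative data).
            v8Runtime.getExecutor("globalThis.myObject = { a: 1, b: 'text' };").executeVoid();
            // JS -> Java: stringify inside V8 and read the result out as a Java String.
            String json = v8Runtime.getExecutor("JSON.stringify(myObject)").executeString();
            // Java -> JS: hand the string back and parse it in the runtime.
            v8Runtime.getGlobalObject().set("restoredJson", json);
            v8Runtime.getExecutor("globalThis.myObjectRestored = JSON.parse(restoredJson);").executeVoid();
        }
    }
}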

Feature Request:

I would like to request support for using V8’s ValueSerializer and ValueDeserializer directly from Javet.

Specifically, I’m looking for functionality that allows:

  • Serializing a V8 object to a byte[] in Java.
  • Deserializing a byte[] back into a V8 object in the JS runtime.
  • Ideally, this could be done by referring to a known object by name, like:
byte[] bytes = v8.serialize(parent, "myObject");
v8.deserialize(parent, "myObjectRestored", bytes);
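
Fleshed out a little, a hypothetical end-to-end usage could look like the following. None of these serialize/deserialize methods exist in Javet today; the names simply mirror the signatures above, and the global object stands in for parent:

// Hypothetical API: serialize/deserialize do not exist in Javet yet.
try (V8Runtime v8 = V8Host.getV8Instance().createV8Runtime()) {
    v8.getExecutor("globalThis.myObject = { a: 1, nested: { b: [1, 2, 3] } };").executeVoid();
    V8ValueGlobalObject parent = v8.getGlobalObject();

    // Snapshot the named object into a byte[] via V8's ValueSerializer.
    byte[] bytes = v8.serialize(parent, "myObject");

    // ... store or transfer the bytes on the Java side ...

    // Restore the bytes as a new named object via V8's ValueDeserializer.
    v8.deserialize(parent, "myObjectRestored", bytes);
}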

Is this possible to implement?

If this is feasible and aligns with the goals of the project, that would be really awesome.

Thanks again!

joakouy · Jun 11 '25, 11:06

I think that's quite an interesting feature request. It will take some time to investigate whether this feature can be implemented.

caoccao · Jun 11 '25, 15:06

> I think that's quite an interesting feature request. It will take some time to investigate whether this feature can be implemented.

Thanks for the quick reply. I just discovered that Node.js exposes the API, but I'm not getting very good numbers. It seems that for small objects, the JSON approach is much faster. I'm running this directly in Node 22.16.0 LTS (no Javet):

const { serialize, deserialize } = require('v8');
const ITERATIONS = 1000;

function generateObj(size) {
  const obj = {};
  for (let i = 0; i < size; i++) {
    obj[`key_${i}`] = `value_${i}`;
  }
  return obj;
}

function benchmark(label, fn) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < ITERATIONS; i++) fn();
  const end = process.hrtime.bigint();
  const durationMs = Number(end - start) / 1_000_000;
  console.log(`${label}: ${durationMs.toFixed(3)} ms`);
}

function runBenchmark(sizes) {
  for (const size of sizes) {
    const o = generateObj(size);
    console.log(`\n--- Testing object with ${size} key-value pairs ---`);

    benchmark('JSON.stringify + JSON.parse', () => {
      const str = JSON.stringify(o);
      const obj = JSON.parse(str);
    });

    benchmark('v8.serialize + v8.deserialize', () => {
      const buf = serialize(o);
      const obj = deserialize(buf);
    });
  }
}

runBenchmark([10, 100, 1000, 5000, 10000]);

I get:

--- Testing object with 10 key-value pairs ---
JSON.stringify + JSON.parse: 3.314 ms
v8.serialize + v8.deserialize: 9.052 ms

--- Testing object with 100 key-value pairs ---
JSON.stringify + JSON.parse: 29.756 ms
v8.serialize + v8.deserialize: 34.846 ms

--- Testing object with 1000 key-value pairs ---
JSON.stringify + JSON.parse: 530.831 ms
v8.serialize + v8.deserialize: 334.644 ms

--- Testing object with 5000 key-value pairs ---
JSON.stringify + JSON.parse: 2807.902 ms
v8.serialize + v8.deserialize: 2991.014 ms

--- Testing object with 10000 key-value pairs ---
JSON.stringify + JSON.parse: 5619.037 ms
v8.serialize + v8.deserialize: 6298.722 ms

I thought the internal V8 serialization would be way faster than using JSON. Maybe Node is adding something on top?

joakouy · Jun 11 '25, 16:06

Thank you for sharing.

That's exactly one of the concerns I have: it might not improve performance in some common cases.

caoccao · Jun 11 '25, 17:06