proposal-binary-ast

Clarify experiment results

Open domenic opened this issue 7 years ago • 5 comments

The time required to create a full AST (without verifying annotations) was reduced by ~70-90%, which is a considerable reduction since parsing time in SpiderMonkey for the plain JavaScript was 500-800 ms for the benchmark.

Is "the time required to create a full AST" 500-800 ms? Or is it a subset of that? Maybe stating the actual reduction in milliseconds would be helpful.
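Taking the quoted figures at face value, the implied absolute numbers are easy to work out. This is back-of-envelope arithmetic only: it assumes the 500-800 ms range is the baseline that the 70-90% reduction applies to, which is exactly the ambiguity this issue is asking to resolve.

```javascript
// Assumption (the point of this issue): the 500-800 ms range is the
// baseline that the 70-90% reduction applies to.
function reducedTimeMs(baselineMs, reductionPct) {
  return baselineMs * (100 - reductionPct) / 100;
}

console.log(reducedTimeMs(500, 90)); // 50  ms (best case)
console.log(reducedTimeMs(800, 70)); // 240 ms (worst case)
```

So if the assumption holds, the binary-AST prototype would construct the AST in roughly 50-240 ms instead of 500-800 ms.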

domenic avatar Jul 19 '17 16:07 domenic

@domenic

Yeah, the phrasing there is poor. In the meantime, looking at the Bugzilla bug for the prototype, the 500-800 ms figure is for AST construction, not the full parse. I'll fix the wording for now, and we'll work on getting a proper table of numbers together.

See: https://bugzilla.mozilla.org/show_bug.cgi?id=1349917#c30

With this patch, we get the following speedups on the (2.5 MB gzipped) Facebook source code, measuring the AST creation phase (which takes ~500-800 ms from source)

kannanvijayan-zz avatar Jul 20 '17 04:07 kannanvijayan-zz

I'll try to be as precise as possible.

On the SpiderMonkey side, the 500-800ms covered:

  • reading source from memory;
  • constructing the AST from source;
  • checking early errors;
  • a few pre-allocations that are performed by SpiderMonkey during parse-time (e.g. object literals).

Everything was done from/to memory, so file I/O was not included. This was benchmarked on a minified Facebook chat.

The prototype side implemented:

  • reading binary source from memory;
  • constructing AST from binary source;
  • NO checking of early errors;
  • an approximation of the pre-allocations performed by SpiderMonkey during parse-time.

Additional information:

  • profiling showed that the difference between lazy parsing and full parsing on this sample was below the noise level;
  • profiling showed that the cost of the pre-allocations had no meaningful impact either on SpiderMonkey or on the prototype for this benchmark;
  • on the other hand, I did not manage to extract meaningful data on the impact of early error checking on the 500-800ms.

Yoric avatar Jul 26 '17 17:07 Yoric

Do you have any performance numbers for other implementations? Even just JS parsed by other engines vs. the binary AST in your implementation?

ojhunt avatar Aug 02 '17 05:08 ojhunt

Oh, I see "The time required to create a full AST (without verifying annotations) was reduced by ~70-90%, which is a considerable reduction since SpiderMonkey's AST construction time for the plain JavaScript was 500-800 ms for the benchmark."

I don't find this a useful performance comparison, because when writing the JSC parser I found that semantic analysis and error checking were the bulk of parse time. In general, parse time (actual lexical analysis and parsing) is linear in code size; even if there are theoretical super-linear paths, they aren't actually hit in normal code.

Also, I'm unsure which numbers I should be looking at, as the numbers in "news bench" don't clearly separate out what is being measured. It's also unclear (given the size of the code and the content involved) whether it is doing different things in different browsers, but that's a general benchmark issue when dealing with actual content.

ojhunt avatar Aug 02 '17 05:08 ojhunt

More numbers are coming, once we have sufficiently progressed with the advanced prototype.

Yoric avatar Aug 02 '17 06:08 Yoric