update jsoo for testing
Opening this draft for Windows CI testing purposes.
Try with this diff:
diff --git a/compiler/test/dune b/compiler/test/dune
index a8b92d0c..111938bc 100644
--- a/compiler/test/dune
+++ b/compiler/test/dune
@@ -22,4 +22,4 @@
(libraries grain grain-tests.framework grain-tests.suites grain-tests.utils)
(modules test)
(js_of_ocaml
- (flags --no-sourcemap --quiet)))
+ (flags --no-sourcemap --quiet --disable use-js-string)))
I'm wondering if the slowdown could be due to the yojson "cmi" reading pattern.
The current implementation:
- first reads the cmi section into some bytes
- converts the bytes to a string
- calls Yojson.from_string
- converts the string back into bytes
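As a rough sketch of those copies (parse_cmi_json and its signature are hypothetical, just to illustrate the shape):
(* Sketch of the current path only; parse_cmi_json is a hypothetical name. *)
let parse_cmi_json (section : bytes) : Yojson.Safe.t =
  (* copy #1: bytes -> string, solely to satisfy Yojson's string-based API *)
  let s = Bytes.to_string section in
  (* Yojson builds its own tree on top; string payloads inside it are then
     copied back into bytes by the caller (copy #2) *)
  Yojson.Safe.from_string s
Yojson.Safe.from_channel also exists and might avoid at least one of those copies when parsing straight from a file.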
An alternative approach can mitigate the perf issue: https://github.com/hhugo/grain/commit/543d5a9255d4fd305c29ff772ec83edf7163d865
Thanks @hhugo! I've cherry-picked that commit into this branch to see it in action. Any idea why this issue is only showing up now (with this upgrade)?
js_of_ocaml-compiler.5.1.0 changed the default representation of strings.
With the commit, you no longer need to disable use-js-string.
I've created a branch with additional changes.
See https://github.com/hhugo/grain/tree/hhugo-jsoo-oom
@hhugo the tests have still gone from 3-4 minutes to 16 minutes with the newer JSOO. This seems untenable for us since our compiler binaries are produced via jsoo. Do you have any other ideas why there would be a 400% slowdown between the versions?
@phated, would you be able to rebase this branch? I have some time to look at this again.
@hhugo I rebased this. We've since reworked a lot of the code we previously thought needed updating, so once CI finishes we'll see how long the tests take and go from there. Thanks for looking into this again!
@phated, I've identified a change that makes everything slow: https://github.com/ocsigen/js_of_ocaml/pull/1409
You enable backtraces unconditionally in module_resolution.
Jsoo will create a JS Error for every exception raised in order to capture backtraces.
Try enabling backtraces only when not running under jsoo:
let () = Printexc.record_backtrace(Sys.backend_type != (Other("js_of_ocaml")));
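Spelled out in OCaml syntax, the same idea looks like this (a sketch; where the check belongs in the codebase is a separate question):
let () =
  (* Under js_of_ocaml, Sys.backend_type is Other "js_of_ocaml"; recording
     backtraces there makes jsoo allocate a JS Error for every raised
     exception, so only enable them on the other backends. *)
  match Sys.backend_type with
  | Sys.Other "js_of_ocaml" -> ()
  | _ -> Printexc.record_backtrace true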
I've done some cleanup in https://github.com/hhugo/grain/tree/cleanup, but it's missing an upgrade to js_of_ocaml-compiler.5.8.2. I don't know how to use esy properly.
@hhugo thanks! Locally, at least, that seems to be the issue. I committed that change here so we can see how fast the tests run. We'll likely also pull the rest of your cleanup into this branch. Really appreciate you looking into this!
It looks like the tests are still running slowly. Locally, I compiled a program and compared performance against main; with the changes it seemed roughly the same, the new jsoo being a tad slower than before. I ran the tests locally and confirmed that they're slow there too, and I also disabled the backtraces for the tests. They look to be 3-4x slower than before.
I think the issue is that 3rd-party libs enable backtraces. See https://github.com/reasonml/reason-native/blob/20b1997b6451d9715dfdbeec86a9d274c7430ed8/src/rely/Util.re#L12
You can try overriding caml_record_backtrace with:
//Provides: caml_record_backtrace
function caml_record_backtrace(b) {
  // caml_record_backtrace_flag = b;
  return 0;
}
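For what it's worth, such a runtime override can be attached via the dune js_of_ocaml field, mirroring the stanza in the diff above (a sketch; hacks.js is the override file name referenced later in this thread):
(js_of_ocaml
 (javascript_files hacks.js)
 (flags --no-sourcemap --quiet))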
Here is what I have locally after tweaking the generated code manually.
$ node _esy/default/store/b/grain__s__compiler-dc7bdbee/default/test/test.bc.js
Running 45 test suites
PASS aliased types
PASS abstract types
PASS recursive types
PASS function types
PASS tuples
PASS strings
PASS stdlib
PASS early return
PASS records
PASS provides
PASS pattern matching
PASS parsing
PASS optimizations
PASS numbers
PASS modules
PASS loops
PASS lists
PASS linking
PASS let mut
PASS includes
PASS functions
PASS foreigns
PASS exceptions
PASS enums
PASS cyclic redundancy checks
PASS comments
PASS chars
PASS boxes
PASS blocks
PASS basic functionality
PASS arrays
PASS aux/wasm_utils
PASS utils/string_utils
PASS utils/mini_bigint
PASS utils/markdown
PASS utils/literals
PASS aux/concatlist
Test Suites: 0 failed, 8 skipped, 37 passed, 45 total
Tests: 0 failed, 405 skipped, 784 passed, 1189 total
Time: 131.838s
Gotcha, that looks like it. Just pushed that change.
@ospencer, I've just merged https://github.com/ocsigen/js_of_ocaml/pull/1637. Would you be able to test js_of_ocaml-compiler master here (dropping the compiler/test/hacks.js override)?
@hhugo Apologies for the delay, I just pushed that change.
There have been many changes on jsoo master lately. Would you be able to test the latest commit of jsoo#master and report?
@hhugo done. We'll see how the tests report!
Dune doesn't build, it seems.
CI seems happy in https://github.com/hhugo/grain/actions/runs/11974464203/job/33385597702?pr=2
Test Suites: 0 failed, 7 skipped, 36 passed, 43 total
Tests: 0 failed, 385 skipped, 741 passed, 1126 total
Time: 338.145s
I'm now seeing the following result locally, using #2210
Test Suites: 0 failed, 8 skipped, 37 passed, 45 total
Tests: 0 failed, 405 skipped, 784 passed, 1189 total
Time: 158.413s
It looks like there are some issues when running in CI.
esy fails locally as well; I can't help with that part. I ended up relying only on opam and dune for my contribution.
CI seems to be passing on everything but Windows on my updated branch here using esy. I think the issue I'm running into on Windows may be related to this.
With https://github.com/grain-lang/grain/pull/2254, I'm seeing the following timing:
Test Suites: 0 failed, 9 skipped, 37 passed, 46 total
Tests: 0 failed, 442 skipped, 792 passed, 1234 total
Time: 47.736s
Have you sped up the test suite somehow?
The speedup is probably from our switch to custom artifact files over wasm linking.
Closing as this work was completed with #2323.