Use `sin()`/`cos()` for the right type, avoiding needless widening.
The `sin()` and `cos()` calls are applied to `float` arguments, but these C functions take and return `double` (the corresponding `float` functions would be `sinf()` and `cosf()`), forcing each argument to be widened to `double` and the result narrowed back to `float`.
Instead, use `std::sin()` and `std::cos()`, which have overloads for each floating-point type.
(Note: there are also a few `atan()` calls, but I didn't modify them, as their result is divided by some other value; possible accuracy loss would need to be assessed first.)
Found by the `performance-type-promotion-in-math-fn` clang-tidy check.
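To illustrate the difference (a minimal sketch, not code from this PR; the function and variable names are made up):

```cpp
#include <cmath>

// theta is a float, as in the placer code paths touched here.
float rotate_x(float x, float y, float theta) {
  // Before: ::sin()/::cos() are the C double functions, so theta is widened
  // to double and the double result is narrowed back to float on assignment.
  float before = ::cos(theta) * x - ::sin(theta) * y;

  // After: std::sin()/std::cos() have float overloads, so the whole
  // computation stays in float.
  float after = std::cos(theta) * x - std::sin(theta) * y;

  return before - after;  // the two can differ in the last bit or so
}
```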
clang-tidy review says "All clean, LGTM! :+1:"
Looks like the relevant test that is breaking, gpl.incremental02.tcl, is not actually executed in bazel. I don't know how to find the error logs in Jenkins, so maybe we should get the gpl tests into bazel first.
@oharboe you have done this a bunch before, is that simple to do?
I ran it manually, and yes, as expected some values differ a bit in the last bits. Not sure how much that matters, as it is in the noise (if it is important, I'd suggest switching to double calculation entirely instead of partially calculating in double and then rounding back to float again).
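For concreteness, a tiny standalone comparison (a sketch, not part of the PR) of the two evaluation orders shows where that last-bit noise comes from:

```cpp
#include <cmath>
#include <cstdio>

int main() {
  float theta = 0.7853981f;  // arbitrary angle near pi/4
  // Old path: compute in double, then narrow the result to float.
  float via_double = static_cast<float>(std::sin(static_cast<double>(theta)));
  // New path: the float overload computes (and rounds) in float precision.
  float via_float = std::sin(theta);
  // The two results can differ in the last bit of the mantissa.
  std::printf("%.9g %.9g diff=%.3g\n", via_double, via_float,
              static_cast<double>(via_double) - static_cast<double>(via_float));
  return 0;
}
```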
@hzeller Looks like the test is executed in bazel.
Output of `bazelisk test src/gpl/test:*`:

```
//src/gpl/test:ar01-tcl_test PASSED in 0.3s
//src/gpl/test:ar02-tcl_test PASSED in 0.3s
//src/gpl/test:clust01-tcl_test PASSED in 0.4s
//src/gpl/test:clust02-tcl_test PASSED in 0.5s
//src/gpl/test:clust03-tcl_test PASSED in 0.3s
//src/gpl/test:cluster_place01-tcl_test PASSED in 0.3s
//src/gpl/test:convergence01-tcl_test PASSED in 1.6s
//src/gpl/test:core01-tcl_test PASSED in 0.3s
//src/gpl/test:density01-tcl_test PASSED in 0.4s
//src/gpl/test:diverge01-tcl_test PASSED in 0.3s
//src/gpl/test:error01-tcl_test PASSED in 0.3s
//src/gpl/test:incremental01-tcl_test PASSED in 0.2s
//src/gpl/test:incremental02-tcl_test PASSED in 19.1s
//src/gpl/test:nograd01-tcl_test PASSED in 0.5s
//src/gpl/test:simple01-obs-tcl_test PASSED in 0.3s
//src/gpl/test:simple01-rd-tcl_test PASSED in 0.6s
//src/gpl/test:simple01-ref-tcl_test PASSED in 0.3s
//src/gpl/test:simple01-skip-io-tcl_test PASSED in 0.2s
//src/gpl/test:simple01-tcl_test PASSED in 0.3s
//src/gpl/test:simple01-td-tcl_test PASSED in 0.8s
//src/gpl/test:simple01-td-tune-tcl_test PASSED in 1.4s
//src/gpl/test:simple01-uniform-tcl_test PASSED in 0.4s
//src/gpl/test:simple02-rd-tcl_test PASSED in 0.6s
//src/gpl/test:simple02-tcl_test PASSED in 0.3s
//src/gpl/test:simple03-rd-tcl_test PASSED in 0.6s
//src/gpl/test:simple03-tcl_test PASSED in 0.4s
//src/gpl/test:simple04-rd-tcl_test PASSED in 0.6s
//src/gpl/test:simple04-tcl_test PASSED in 0.3s
//src/gpl/test:simple05-tcl_test PASSED in 0.3s
//src/gpl/test:simple06-tcl_test PASSED in 0.3s
//src/gpl/test:simple07-tcl_test PASSED in 0.3s
//src/gpl/test:simple08-tcl_test PASSED in 0.3s
//src/gpl/test:simple09-tcl_test PASSED in 0.3s
//src/gpl/test:simple10-tcl_test PASSED in 14.5s
```
(I don't know how to update the golden files; is there a process? I just copied the result to the *.ok file, but it looks like the "Check that OK files are up to date" check disagrees.)
clang-tidy review says "All clean, LGTM! :+1:"
> @hzeller Looks like the test is executed in bazel.
> `bazelisk test src/gpl/test:*`
mmh, then I am wondering why it did not complain the same way the cmake test did. Does it properly compare the right files, @oharboe?

> mmh, then I am wondering why it did not complain the same way the cmake test did. Does it properly compare the right files, @oharboe?
Don't know...
Marking this as draft right now, as there are various open questions:
- How would I update the golden files if needed?
- Why does the bazel test not complain, while the cmake test did?
- Is the noise in the lower bits acceptable?
In the incremental02 test, a handful of measurements showed about a 3% improvement, but that might just be noise. We would need a real unit test with a benchmark suite to confirm.
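Something along these lines could serve as a rough micro-benchmark (a standalone sketch with made-up array sizes, not part of this PR), timing the widened double path against the float overload:

```cpp
#include <chrono>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  // Fill a buffer of float angles to feed both variants.
  std::vector<float> angles(1 << 20);
  for (size_t i = 0; i < angles.size(); ++i) {
    angles[i] = 0.001f * static_cast<float>(i % 6283);
  }

  auto time_it = [&](auto fn) {
    float sink = 0.0f;  // accumulate results so the calls are not optimized away
    auto start = std::chrono::steady_clock::now();
    for (float a : angles) {
      sink += fn(a);
    }
    auto stop = std::chrono::steady_clock::now();
    std::chrono::duration<double, std::milli> ms = stop - start;
    std::printf("%8.3f ms (sink=%g)\n", ms.count(), static_cast<double>(sink));
  };

  // Widened path: double sin(), result narrowed back to float.
  time_it([](float a) {
    return static_cast<float>(std::sin(static_cast<double>(a)));
  });
  // Float overload: stays in float throughout.
  time_it([](float a) { return std::sin(a); });
  return 0;
}
```

The numbers from such a toy loop depend heavily on the compiler, optimization flags, and libm, so they would only be a rough sanity check, not a substitute for timing the actual gpl tests.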
> Marking this as draft right now, as there are various open questions:
> - How would I update the golden files if needed?
Eventually, I'd like to see a `bazel run ...testname_update` rule to update the reference data, just like https://github.com/The-OpenROAD-Project/OpenROAD/tree/master/test/orfs#updating-rules_json-files
I expect you are seeing https://github.com/The-OpenROAD-Project/OpenROAD/pull/8314
There are save_ok/save_def_ok/etc. scripts to update the golden files, though they mostly consist of copying the result to the corresponding .ok file.
Low bit noise is probably ok but we will need to test that it doesn't happen to affect some design. Sometimes you can get the butterfly effect. Once the unit tests are sorted I'll run a suite.
incremental02 does seem to produce a different DEF file compared to the cmake build. It likely just comes down to a different compiler or compiler options.
> (I don't know how to update the golden files; is there a process? I just copied the result to the *.ok file, but it looks like the "Check that OK files are up to date" check disagrees.)
I think it should be:
`bazelisk run //src/gpl/test:ar01-tcl_update`
Rebased, and the manual test run with bazel seems to pass (I don't have the cmake set-up anymore, so I can't test that locally). If I understood correctly, that test should now actually compare against the *.ok file after @hongted's change:
`bazel test -c opt src/gpl/test:incremental02-tcl_test`
Leaving as draft until all questions are resolved.
clang-tidy review says "All clean, LGTM! :+1:"
Is the "Check that OK files are up to date" failure just expected because I changed the *.ok file?
clang-tidy review says "All clean, LGTM! :+1:"