elmerfem
Build ElmerIce in CI on all platforms and run tests
This PR changes the build rules for the CI to build with ElmerIce on all platforms.
Also run the tests that have the label elmerice-fast in the CI.
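For illustration, attaching and selecting such a label looks roughly like the minimal sketch below; the test name and solver input file are placeholders, and in the real tree the project's own test macros take care of the add_test call. CI can then run only the quick cases with ctest -L elmerice-fast.

```cmake
# Register a test and tag it so that CI can select only the quick ElmerIce
# cases via the "elmerice-fast" label. The test name and command below are
# placeholders, not actual names from the repository.
add_test(NAME SomeElmerIceTest COMMAND ElmerSolver case.sif)
set_tests_properties(SomeElmerIceTest PROPERTIES LABELS "elmerice-fast")
```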
Additionally, some changes are made that allow building with the Ninja generator (avoiding building the same Fortran module twice) and that allow running the ElmerIce tests on Windows (adding the library directories to the PATH environment variable).
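For the Windows part, one way to express the idea is the sketch below, assuming CMake 3.22 or newer for the ENVIRONMENT_MODIFICATION test property; ELMER_LIBRARY_OUTPUT_DIR is a placeholder for wherever the Elmer and ElmerIce libraries end up in the build tree, not an actual variable in the project.

```cmake
# Sketch: make the shared libraries findable when ctest launches a test on
# Windows by prepending their directory to PATH for that test only.
# ELMER_LIBRARY_OUTPUT_DIR is a placeholder name.
set_tests_properties(SomeElmerIceTest PROPERTIES
  ENVIRONMENT_MODIFICATION
    "PATH=path_list_prepend:${ELMER_LIBRARY_OUTPUT_DIR}")
```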
Some tests are failing currently. I haven't looked into the reason why.
For now, I applied the patch to the full test set and created an elmerice test run that is triggered automatically on branch "elmerice" (with at most 4 MPI tasks). The problem is that the tests do not pass on any platform, so a failure is not a good indicator that something has actually been broken. A major reason for this is that elmerice tests are often more complicated (representing actual science) than the other tests. This has partly led to the cases not always being run, and many have broken over the years. I would hope that we fix the 3 cases before running elmerice in all tests.
I added just the elmerice tests; currently 10 fail on gcc. https://github.com/ElmerCSC/elmerfem/actions/runs/10401394765/job/28803865353
Thank you for adopting parts of the proposed changes. However, my main motivation for this change was to add CI before tackling #521 (which is Windows-only), so the parts that you adopted don't help much with that.
If you prefer not to accept the changes in this PR, I can still use them by repeatedly rebasing onto different branches and running the CI in my fork. That is a bit awkward, but it might still work.
Just to understand (for potential later PRs): What is your motivation to not run these tests on all platforms? Is ElmerIce supposed to only work on Linux when built with GCC?
The reason is that there are many failing elmerice tests currently. See https://github.com/ElmerCSC/elmerfem/actions/runs/10401394765/job/28803865353#step:8:1047
This being the case, all the test runs would fail and we lose the overall test status as an indicator. Of course, people can click through to see which of the tests passed. Typically the history of an elmerice test is related to some thesis work, while the standard Elmer tests are more like unit tests.
We are currently looking at the tests so that at least the fast ones could be added to the general testing.
The elmerice community is small but active, whereas the general community is much larger but not as active. I don't think almost anybody does computational glaciology on a Windows PC; Macs may even be more common.
Thank you for clarifying.
I understand that few users are using ElmerIce on Windows. Actually, it doesn't even build there without some changes from this PR. Still, it might be nice if there were an easy way to check that no new regressions are introduced while trying to address other issues.
If you prefer not to let the CI run for ElmerIce currently, I'll try to at least separate the changes that are needed to build and run the tests locally from this PR and open a new one with just those changes. I hope that will be fine. (Edit: See #524.)
It looks like some tests are stuck (indefinitely?) on macOS 14 (that is on Apple Silicon in case that should matter):
Start 24: Friction_Coulomb
3/12 Test #24: Friction_Coulomb .................***Timeout 1500.03 sec
Is there any output from the test available?
The CI logs contain the following:
Start 24: Friction_Coulomb
3/12 Test #24: Friction_Coulomb .................***Timeout 1500.03 sec
Starting program Elmergrid, compiled on Aug 15 2024
Elmergrid reading in-line arguments
Output will be saved to file rectangle.
Elmergrid loading data:
-----------------------
Loading the geometry from file 'rectangle.grd'.
Loading ElmerGrid file version: 210903
Defining the coordinate system (2-DIM).
Loading 2 subcell limits in X-direction
Loading 2 subcell limits in Y-direction
Loading material structure
LoadElmergrid: materials interval is [1,1]
Loading boundary conditions
Found 4 boundaries
Reached the end of command file
Found 1 divisions for grid
Loading ElmerGrid commands from file 'rectangle.grd'.
Reached the end of command file
Read commands from a file
Elmergrid creating and manipulating meshes:
-------------------------------------------
1 cells were created.
Numbered 451 knots in 400 4-node elements.
Numbering order was <x><y> and max levelwidth 43.
40 element sides between materials -1 and 1 were located to type 1.
10 element sides between materials -2 and 1 were located to type 2.
40 element sides between materials -3 and 1 were located to type 3.
10 element sides between materials -4 and 1 were located to type 4.
Coordinates defined in 2 dimensions
Elmergrid saving data with method 2:
-------------------------------------
Saving mesh in ElmerSolver format to directory rectangle.
Reusing an existing directory
Saving 451 coordinates to mesh.nodes.
Saving 400 element topologies to mesh.elements.
Saving boundary elements to mesh.boundary.
Saving header info to mesh.header.
Thank you for using Elmergrid!
Send bug reports and feature wishes to [email protected]
-- BINARY_DIR = /Users/runner/work/elmerfem/elmerfem/build
IIUC, that is only the output of ElmerGrid (which seems to have run successfully). There is no output at all from ElmerIce in the logs afaict.
I don't know whether that is because there was no output or because the test suite prevents it from showing up. (I haven't quite understood yet what is being done with the standard and error output streams and why. As far as I can tell, the output of the tests is redirected manually to files that are (optionally) displayed at a later time. Maybe that "later time" never happens if there is a timeout. I thought that ctest had built-in options, such as --output-on-failure, that handle displaying the output anyway. But there might be a valid reason why handling stdout and stderr was "taken out of the hands" of ctest...)
At least one nice thing about the redirect is that we can go to the tests directory and study the stdout and stderr; each test has them in its own test directory. I don't know whether this is one such reason. Maybe @juhanikataja has an idea?
Manually rebased to resolve merge conflict.
It looks like only a single test is currently failing on Ubuntu.
Would it increase the chances of this PR being accepted if I focused on that platform first and deferred the other platforms to potentially later?
It looks like ElmerIce cannot be built currently without MPI.
This is not a practical limitation since all Elmer/Ice users probably need MPI.
Thanks for clarifying. 👍
I pushed an additional commit that checks early on if MPI is enabled when ElmerIce should be built.
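Conceptually the guard amounts to something like the following sketch (not necessarily the exact code in the commit; the option names WITH_ELMERICE and WITH_MPI are assumptions about the existing cache variables and may be spelled differently in the tree):

```cmake
# Fail at configure time, rather than at build or test time, when ElmerIce is
# requested without MPI. The option names are assumed, not verified.
if(WITH_ELMERICE AND NOT WITH_MPI)
  message(FATAL_ERROR
    "ElmerIce requires MPI: configure with -DWITH_MPI=TRUE "
    "or disable ElmerIce with -DWITH_ELMERICE=FALSE.")
endif()
```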
I additionally modified the ForceToStress_parallel test to run only if 4 MPI processes are allowed. That test is currently failing in CI because the GitHub-hosted Ubuntu runners have only two cores. I'm not sure whether this is the best approach, but it should do what we want here.
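One way to express such a guard in the test's CMakeLists.txt is sketched below. ADD_ELMERICE_TEST is a placeholder for the project's real test helper, and MPIEXEC_MAX_NUMPROCS is the maximum process count reported by CMake's FindMPI module.

```cmake
# Register the 4-task case only when the MPI launcher reports at least four
# available processes; otherwise the test is skipped entirely.
# ADD_ELMERICE_TEST is a placeholder, not the project's actual macro name.
if(MPIEXEC_MAX_NUMPROCS GREATER_EQUAL 4)
  ADD_ELMERICE_TEST(ForceToStress_parallel)
endif()
```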
Marking as ready for review.
Some runners failed with a strange error. (Maybe an issue with the connection to some of GitHub's data centers?) But the ones that matter for this PR (the Ubuntu runners) pass all tests.
This should be ready to be merged.