NBC rebase
see #591
This is just to see if I can use anything for the RBC branch
retest this please
Should we merge this then?
No, because I need to implement the switch from wait to test outlined in #590. The version as-is hurts single-core and scaling performance.
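To spell out what I mean by the switch (a minimal sketch, assuming the usual MPI_Wait vs MPI_Test idiom; the function names are illustrative, not actual HemeLB code):

```cpp
#include <mpi.h>

// Stand-in for whatever computation can usefully overlap with communication.
void DoSomeLocalWork() { /* ... */ }

// Instead of blocking in MPI_Wait right after posting the non-blocking
// collective, keep doing local work and poll the request with MPI_Test.
void ReduceWithOverlap(double* local, double* global, MPI_Comm comm)
{
  MPI_Request request;
  MPI_Iallreduce(local, global, 1, MPI_DOUBLE, MPI_SUM, comm, &request);

  int done = 0;
  while (!done)
  {
    DoSomeLocalWork();
    MPI_Test(&request, &done, MPI_STATUS_IGNORE);
  }
}
```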
retest this please
Note that Jenkins will need to be updated. Until this branch is merged, I've just edited the config in the GUI, which will get overwritten at some point from the test_jobs branch. We need to remove the -i option from the command line, remove the ImageComparison test line, and also remove the HEMELB_USE_STREAKLINES configure define.
So this passes now! I discovered the odd performance was specific to my OpenMPI install and has vanished since I rebuilt it. Once I've tested it on ARCHER, we can merge.
retest please
test this please
retest this please
@jenshnielsen I cannot get this PR to retest in Jenkins. Is the "retest this please" hook still working?
Could you give me admin rights to the HemeLB organisation? @jamespjh changed the URL to the Jenkins service, so we probably need to change the webhook endpoint.
@jenshnielsen I just made you Owner of UCL-CCS (that's the organisation behind the hemelb-dev/ repo). Am I getting this right?
Yes, I will have a look and you can remove me again when it works.
retest this please, innit
This PR will close #647
@mdavezac I'd appreciate it if you could have a glance at the problem with MPI+CMake on Legion+GNU. It is certainly possible there's an issue in the mpi.cmake file that checks that MPI 3.0 is available. I'm just going to check this on ARCHER.
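For reference, configure-time checks like this usually boil down to try-compiling a tiny MPI program; something along these lines is what I'd expect the test source to look like (an assumption about how mpi.cmake works, not its actual contents):

```cpp
#include <mpi.h>

// Fail the configure-time compile if the headers advertise less than MPI 3.0
// (needed for the non-blocking collectives used in this branch).
#if !defined(MPI_VERSION) || MPI_VERSION < 3
#error "MPI 3.0 or later is required"
#endif

int main(int argc, char** argv)
{
  MPI_Init(&argc, &argv);
  int version = 0, subversion = 0;
  // Runtime counterpart of the compile-time MPI_VERSION macro.
  MPI_Get_version(&version, &subversion);
  MPI_Finalize();
  return 0;
}
```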
For some reason this stalled in Jenkins on the simulation shutdown. It works fine on my MacBook, so retest this please to find out if this is just a one-off.
Pleased to say that unit and regression tests are passing on ARCHER. Gonna test scaling now.
retest this please
@rupertnash, the hand-rolled MPI version thingie breaks on Legion with OpenMPI 1.8.4 (and gcc). It corresponds to version 3.0. Are you sure you need to roll your own FindMPI + extra script?
I'm not too keen on investigating why hand-rolled stuff breaks :( Unless it's my own hand-rolled crap :)