core
Ken's PRs
- [x] GMSH v4 support - https://github.com/SCOREC/core/pull/374
- [x] Extrusion tags - https://github.com/SCOREC/core/pull/375
- [x] https://github.com/SCOREC/core/pull/380 - in master @ https://github.com/SCOREC/core/commit/229f849247bdb3b329f3e63a1a56828cfb137cf6
- [x] merge CGNS reader/writer - https://github.com/SCOREC/core/pull/296
- this is needed for CEEDPhasta
- [x] chef support for part count reductions - https://github.com/SCOREC/core/pull/384 - in develop @ 31ddc29
- [ ] CGNS writer with a single 'base' - https://github.com/SCOREC/core/pull/395
- [ ] convert with option to skip verify(...)
- [ ] chef support for matched/periodic faces - https://github.com/SCOREC/core/issues/377 - needs testing
- [ ] topo model closure - https://github.com/SCOREC/core/issues/369
- [ ] write 'face type' tags for boundary layer entities only - https://github.com/SCOREC/core/issues/381
- [ ] check for high numbers of upward adjacencies and fail early - https://github.com/SCOREC/core/tree/cws/failUpwardAdjLimit
meeting notes 11/4/2022
- for a (partial) summary of how parallel mesh construction works see https://github.com/SCOREC/core/issues/245
- the current HEAD of `develop` has added multi-topology support via repeated calls to `assemble` followed by `finalize` (see https://github.com/SCOREC/core/blob/831133720ad071abd462b65a627921e47780fe49/apf/apfConvert.h#L34-L67)
- the branch that exists for `matchedNodeElementReader` has one call to the `construct` api which takes in a `numElements*maxDownwardVerts` array for element-to-vertex connectivity (where vertices are defined by their global vertex id)
- the mesh generator that feeds `matchedNodeElementReader` already produces elements in groups by topological type and can/will be changed to create files that have one array (`numElements(elmType)*numDownwardVerts(elmType)`) per topology
- a test input for this will be created for the next meeting on 11/11
- with the mesh generator change the modifications to `construct` will be abandoned and `assemble` will be called instead
- the `MeshInfo` struct that handles these arrays in the `matchedNodeElementReader.cpp` driver will have to be modified to support split topology arrays
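The split-topology storage discussed above might look roughly like the following sketch; `TopoBlock` and `splitByTopology` are hypothetical names for illustration, not the actual `MeshInfo` layout in the driver:

```cpp
#include <cstddef>
#include <map>
#include <vector>

// Hypothetical per-topology connectivity storage: instead of one
// numElements x maxDownwardVerts array padded for the widest element,
// keep a tightly packed rectangular array per topology.
struct TopoBlock {
  int vertsPerElement = 0;        // e.g. 8 for hex, 6 for wedge
  std::vector<long> connectivity; // numElements * vertsPerElement global vert ids
  std::size_t numElements() const {
    return connectivity.empty() ? 0 : connectivity.size() / vertsPerElement;
  }
};

// Split a mixed element stream (verts-per-element varies by element) into
// per-topology blocks keyed by verts-per-element.
std::map<int, TopoBlock> splitByTopology(
    const std::vector<std::vector<long>>& elements) {
  std::map<int, TopoBlock> blocks;
  for (const auto& elm : elements) {
    const int nv = static_cast<int>(elm.size());
    TopoBlock& b = blocks[nv];
    b.vertsPerElement = nv;
    b.connectivity.insert(b.connectivity.end(), elm.begin(), elm.end());
  }
  return blocks;
}
```

For example, two hexes and a wedge produce two blocks, keyed 8 and 6, each rectangular in its own verts-per-element width.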
meeting notes 11/18/2022
- multi topo file format
- coordinate file does not change
- element file
- header per block/topology:
- num verts per element
- num elements
- header per block/topology:
- source of matching info
- if the mesh generator does not provide per-entity matching info (remote copies), then a serial code (ideally in PUMI) should derive the entity level matching info from model attributes specified on model edges/faces
- matching support in chef
- ideally, a model attribute per face/edge pair defines whether matching is enabled on that pair
- if a specific pair is enabled, then the mesh entities that are matched must have remote copy information for their matches
- currently in chef, matching is on for all model face/edge pairs whenever the mesh contains matching info; it can only be disabled explicitly, which turns matching off for all model face/edge pairs at once
- in the current code, changing the default to 'off' would be helpful
- extrusion mesh generation with matching
- two cases, info is the same, but mesh generation and structure differs
- perfect matching in a simple extrusion across one pair of geometric model faces
- 2d root plane, any number of points along z, with each layer having varying z depths
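For the perfect-matching extrusion case, if vertex ids are laid out plane by plane from the 2D root plane (an assumption about the generator's numbering, not something the notes specify), the matched pairs across the two model faces reduce to simple index arithmetic:

```cpp
#include <utility>
#include <vector>

// Sketch: a mesh extruded from a 2D root plane with numLayers element layers
// has numLayers+1 vertex planes. Assuming vertex ids are numbered plane by
// plane, vertex i on the root plane matches vertex i + numLayers*planeVerts
// on the opposite model face.
std::vector<std::pair<long, long>> extrusionMatches(long planeVerts,
                                                    int numLayers) {
  std::vector<std::pair<long, long>> pairs;
  pairs.reserve(planeVerts);
  for (long i = 0; i < planeVerts; ++i)
    pairs.emplace_back(i, i + static_cast<long>(numLayers) * planeVerts);
  return pairs;
}
```

The varying z depths per layer do not change the pairing, only the coordinates, which is why the matching info is the same in both cases.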
meeting notes 12/2/2022
- CWS will assume the files exist for reading in the multi-topo format discussed last time and write a draft of the reading logic for it
- KJ will prepare an example multi-topo formatted file
meeting notes 12/9/2022
- https://github.com/SCOREC/core/pull/380 created, compiles, multi-topo reader is a WIP
- proposed file format for element-to-vertex multi-topology meshes
- vertex coordinate file reading does not change
- part ids start at 0
- each part gets a file containing rectangular arrays, one per topology present on that part, that provide element to vertexGlobalId connectivity in the order listed in that part's section of the header file
- one header file with:
1 1 No idea why we write this
3 also not sure what this is used for
<numelTotal> <maxNodesPerElement>
Part1
<numel_topo_1> <NodesInElementTopo1>
<numel_topo_2> <NodesInElementTopo2>
… for as many topos as are in Part 1
Repeat the above block for each part.
meeting notes 1/6/2023
- found latest branch for chef reduction (across ALCF, NASA, colorado viz nodes) and added the branch to the OP list
- added 4 part test case for matchedNodeElementReader to pumi-meshes https://github.com/SCOREC/pumi-meshes/commit/7733c3ee82cc589ef378816c1202322e264c98c1
- mixed topo
- in new file format
- 4 parts: 0,1 - all hex, 2 - mixed, 3 - all wedges
meeting notes 1/13/2023
- as of https://github.com/SCOREC/core/commit/b46f2c88ea7fc52bfd1a950d3804ea307c27d70e the header and the conn arrays are being read for multi-topo
- there is one hack in the code for setting the dimension here: https://github.com/SCOREC/core/blob/b46f2c88ea7fc52bfd1a950d3804ea307c27d70e/test/matchedNodeElmReader.cc#L776 I didn't see it in the header: https://github.com/SCOREC/pumi-meshes/commit/7733c3ee82cc589ef378816c1202322e264c98c1#diff-93a634d03353fa0e6384bb7a8842220b948f1ec18a25e1082afd200cb4c32aa6
- running the https://github.com/SCOREC/pumi-meshes/commit/7733c3ee82cc589ef378816c1202322e264c98c1 test case fails with a floating point exception and the output pasted below
$ mpirun -np 4 ./test/matchedNodeElmReader \
/space/cwsmith/core/pumi-meshes/matchedNodeElementReader/geom3D.cnndt \
/space/cwsmith/core/pumi-meshes/matchedNodeElementReader/geom3D.coord \
NULL \
/space/cwsmith/core/pumi-meshes/matchedNodeElementReader/geom3D.class \
NULL \
NULL \
/space/cwsmith/core/pumi-meshes/matchedNodeElementReader/geom3DHead.cnn \
foo.dmg foo.smb
numVerts 352068
2 46941 8
2 18222 6
1 65037 8
3 153102 6
0 64704 8
isMatched 0
CR1 mysize=0
CR5: self=3,myOffset=0,quotient=0
0 after residence
2 after residence
3 after residence
1 after residence
0 done inside remotes
1 done inside remotes
2 done inside remotes
3 done inside remotes
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 28879 RUNNING AT cranium.scorec.rpi.edu
= EXIT CODE: 136
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Floating point exception (signal 8)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
cwsmith@cranium: /space/cwsmith/buildPumiOptonSimonOmegaoff $
meeting notes 1/20/2023
- Fixed the FPE bug with https://github.com/SCOREC/core/commit/cacacacec80ebc037ca63cc6893dd9f5c13c94c1
- The run now fails in mesh verification ('verify'). See the output below.
- @KennethEJansen Does the `setCoords int overflow of:` output mean we are overflowing?
cwsmith@cranium: /space/cwsmith/buildPumiOptonSimonOmegaoff $ ../runMner.sh
numVerts 352068
3 153102 6
1 65037 8
0 64704 8
2 46941 8
2 18222 6
isMatched 0
constructResidence: self=0,gid=0,ifirst=0
constructResidence: self=1,gid=0,ifirst=0
constructResidence: self=2,gid=91452,ifirst=0
constructResidence: self=3,gid=488,ifirst=0
constructResidence: self=0,gid=0,ifirst=0,max=352067
constructResidence: self=1,gid=0,ifirst=0,max=352067
constructResidence: self=2,gid=91452,ifirst=0,max=352067
constructResidence: self=3,gid=488,ifirst=0,max=352067
CR1 mysize=88017
CR5: self=3,myOffset=264051,quotient=88017
2 after residence
0 after residence
1 after residence
3 after residence
0 done inside remotes
1 done inside remotes
2 done inside remotes
3 done inside remotes
setCoords int overflow of: self=0,mySize=88017,total=352068, n=88017,to=0, quotient=88017, remainder=0 start=0, peers=4, sizeToSend=2112408, nverts=88017
setCoords int overflow of: self=1,mySize=88017,total=352068, n=88017,to=1, quotient=88017, remainder=0 start=88017, peers=4, sizeToSend=2112408, nverts=88017
setCoords int overflow of: self=2,mySize=88017,total=352068, n=88017,to=2, quotient=88017, remainder=0 start=176034, peers=4, sizeToSend=2112408, nverts=88017
setCoords int overflow of: self=3,mySize=88017,total=352068, n=88017,to=3, quotient=88017, remainder=0 start=264051, peers=4, sizeToSend=2112408, nverts=88017
fathers2D not requested
seconds to create mesh 1.044
APF FAILED: apf::Verify: edge with 2 adjacent faces
centroid: (0.684542, 0.00348426, 0.0833333)
based on the following:
- edge is classified on a model region
we would expect the adjacent face count to be at least 3
APF FAILED: apf::Verify: edge with 2 adjacent faces
centroid: (0.53322, -0.00227891, 0.0833333)
based on the following:
- edge is classified on a model region
we would expect the adjacent face count to be at least 3
APF FAILED: apf::Verify: edge with 2 adjacent faces
centroid: (0.684311, 0.000954821, 0.0833333)
based on the following:
- edge is classified on a model region
we would expect the adjacent face count to be at least 3
APF FAILED: apf::Verify: edge with 2 adjacent faces
centroid: (0.538636, -0.0196729, 0.0833333)
based on the following:
- edge is classified on a model region
we would expect the adjacent face count to be at least 3
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 32670 RUNNING AT cranium.scorec.rpi.edu
= EXIT CODE: 134
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Aborted (signal 6)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
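On the overflow question: the `sizeToSend` values in the log above equal `nverts * 3 * sizeof(double)` (88017 * 24 = 2112408), which suggests the check guards a byte count handed to MPI, whose count arguments are plain `int`s. A sketch of such a guard (a guess at what `setCoords` checks, not the actual code):

```cpp
#include <climits>
#include <cstdint>

// Guess at the setCoords guard: each vertex ships 3 double coordinates,
// so the per-destination message size in bytes is nverts * 3 * 8. Compute
// it in 64 bits and flag when it would not fit in an int (the type MPI
// uses for message counts prior to the MPI-4 large-count interfaces).
bool sendSizeFitsInInt(long nverts) {
  const std::int64_t bytes =
      static_cast<std::int64_t>(nverts) * 3 * sizeof(double);
  return bytes <= INT_MAX;
}
```

For this run, 2112408 is far below `INT_MAX`, so the message looks like diagnostic output rather than an actual overflow; the verify failure is presumably unrelated.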
meeting notes 2/3/2023
- all tests passing in the `match2B` PR
- fixed broken github CI tests by adding `pumi.h` to `phCook.cc` for the memory query function ... update 2/4/2023: this in turn breaks builds with simmetrix. See https://github.com/SCOREC/core/issues/382
meeting notes 2/17/2023
- cgns merged - there is one CI failure that needs to be resolved
- moved chef part count reduction up to the next PR
- ken will provide a test that doesn't exit cleanly
- on colorado:
/projects/tools/SCOREC-core/core/pumi-meshes/phasta/4-1-Chef-Tet-Part/4-4-Chef-Part-ts20/run
- changed split factor from 1 to -2 in adapt.inp
- fails on free (double free we assume)
- cws will rebase off develop and take a look at the failure
meeting notes 2/24/2023
- see https://github.com/SCOREC/core/pull/384
meeting notes 3/17/2023
- debugged leak in chef reduction and fixed it
- see https://github.com/SCOREC/core/pull/384#issuecomment-1474017507
meeting notes 6/30/2023
- merged chef reduction branch - see #384
meeting notes 6/13/2024
- added CGNS writer with a single 'base' - https://github.com/SCOREC/core/pull/395 to list
- resuming weekly meetings starting on 6/21
meeting notes 6/21/2024
- rebased CGNS_oneBase on `develop` @ e765832 and pushed to https://github.com/SCOREC/core/tree/CGNS_OneBaseRB2Dev
- need to merge the changes from the pumi-meshes branch (https://github.com/SCOREC/pumi-meshes/tree/CGNS_tests) into the master branch - this showed up as a conflict during the rebase onto develop
- issues accessing spack installed compilers on viz003 - `spack load [email protected]` reports 'invalid version specifier'
- need to decide if we are going to keep the current PR or start a new one