
Interfacing with parmmg using a parallel mesh data structure

Open prudhomm opened this issue 5 years ago • 4 comments

We have a parallel mesh data structure in feelpp that provides all kinds of information regarding its parallel distribution (ghosts, edges, faces, ...). When looking at the examples, a few things are unclear:

  • what does parmesh->info.root return?
  • does _centralized mean that the mesh is built sequentially and then distributed, and vice versa?
  • can we set up a parmmg mesh data structure directly with setters and getters, as in the sequential case?
    • how do you handle interprocess data? how do you stitch/deal with the interfaces between processors?
    • what about the ghost cells? don't you need them?

prudhomm avatar Feb 21 '20 07:02 prudhomm

see feelpp/feelpp#1430 for the internal discussion and interface

prudhomm avatar Feb 21 '20 08:02 prudhomm

Hello,

  • Normally, parmesh->info.root returns the rank of the process in charge of centralised I/O operations. By default, it is rank 0.
  • Yes, _centralized means that Parmmg takes a sequential mesh as input, remeshes it in parallel, and returns a merged mesh on rank 0. This is in contrast to _distributed, where a distributed mesh is provided as input and a distributed mesh is obtained as output.
  • If you want to handle a distributed mesh coming from a solver, the example to look at is libexamples/adaptation_example0/parallel_IO/manual_IO/main.c. There we manually partition a cube mesh on either 2 or 4 processes, in order to show how to use the _Init_, _Set_ and _Get_ functions (mostly analogous to the Mmg ones) that each process can use to set/get its part of the mesh. The only novelty in the API, with respect to Mmg, is in the Set/Get functions for partition interfaces. We deal with element-based partitions where the interprocess interfaces (called communicators in Parmmg) are simply made of tetrahedra faces and the nodes on them, without ghost cells. This is conceived as an additional layer of information on top of a classical Mmg mesh, so the distributed input only asks for a tetrahedral mesh, possibly with a triangular boundary, plus lists of interface faces/nodes with a global ID to match them on the two processes sharing them.

The distributed input is shown in action at lines 704--754 of the above example. The user can choose whether to give us interface face information or interface node information by means of the PMMG_IPARAM_APImode parameter, which takes the value PMMG_APIDISTRIB_faces or PMMG_APIDISTRIB_nodes. Then, depending on the choice, the user should provide:

  • The number of face/node communicators (i.e. the "edges" of the process communication graph) through the PMMG_Set_numberOfFaceCommunicators function (respectively, nodes).
  • The size of each communicator, given its label and the color (== rank) of the process "on the other side of the interface", through the PMMG_Set_ithFaceCommunicatorSize function.
  • Two arrays of the local and global indices of the triangles/nodes on the interface, for each communicator, through the PMMG_Set_ithFaceCommunicator_faces function. The last argument tells Parmmg whether the given array entries are already ordered in the same way on the two processes sharing them (0 if ordered, 1 if not).

These functions are documented in src/libparmmg.h, from line 1872 on. Two caveats:

  • If you provide face communicators, the interface faces need to be set as triangles before the triangle indices are passed to the communicator setter function.
  • If you provide node communicators, a node shared by multiple processes should be set on each pair of interacting processes.
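
Putting the steps above together, here is a hedged C sketch of the face-based distributed input. The helper function and its parameter names (ncomm, color_out, nitem, local_tria, global_id) are our own invention; the PMMG_* function names and call order follow the ones named above, but check src/libparmmg.h for the exact documented signatures. Error checks are omitted for brevity.

```c
#include <mpi.h>
#include "libparmmg.h"

/* Sketch: declare the face communicators of this rank's mesh part.
 * The caller is assumed to have already built the parmesh (PMMG_Init_parMesh,
 * PMMG_Set_meshSize, PMMG_Set_vertex/tetrahedron/triangle, ...) and to have
 * extracted the interface data from its own partition. */
void set_face_interfaces(PMMG_pParMesh parmesh,
                         int ncomm,            /* number of face communicators  */
                         const int *color_out, /* neighbour rank per communicator */
                         const int *nitem,     /* faces per communicator        */
                         int **local_tria,     /* local triangle indices        */
                         int **global_id) {    /* matching global face IDs      */
  /* Select the face-based distributed API (the alternative is
   * PMMG_APIDISTRIB_nodes for node communicators). */
  PMMG_Set_iparameter(parmesh, PMMG_IPARAM_APImode, PMMG_APIDISTRIB_faces);

  /* One communicator per "edge" of the process communication graph. */
  PMMG_Set_numberOfFaceCommunicators(parmesh, ncomm);

  for (int icomm = 0; icomm < ncomm; icomm++) {
    /* Caveat 1: the interface faces must already have been set as triangles
     * (PMMG_Set_triangle) before their indices are passed here. */
    PMMG_Set_ithFaceCommunicatorSize(parmesh, icomm,
                                     color_out[icomm], nitem[icomm]);

    /* Last argument: 0 if the two ranks list the faces in the same order,
     * 1 if ParMmg must match them itself through the global IDs. */
    PMMG_Set_ithFaceCommunicator_faces(parmesh, icomm,
                                       local_tria[icomm], global_id[icomm], 1);
  }
}
```

After this setup, the remeshing itself would be triggered with PMMG_parmmglib_distributed, and the adapted mesh part is retrieved on each rank with the corresponding _Get_ functions.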

Regarding the feelpp discussion: Please note that surface adaptation in Parmmg is not ready yet (i.e. it is forbidden by default), but it is currently under development. Please do not hesitate to come back for more clarifications.

Yours, Luca

lcirrottola avatar Feb 21 '20 16:02 lcirrottola

Hello Luca, Do you have an ETA for the surface adaptation in ParMmg?

prj- avatar Feb 21 '20 19:02 prj-

Hello, We are working on it at the moment, and we aim to deliver it fully fledged by June.

lcirrottola avatar Feb 24 '20 10:02 lcirrottola