
exact role of "tag_buffer" in the TimeRefinementIntegrator

Open nicolasaunai opened this issue 2 years ago • 4 comments

Documentation says

tag_buffer array of integer values (one for each level that may be refined) representing the number of cells by which tagged cells are buffered before clustering into boxes.

This is not entirely clear, at least to me. Could you please be more specific about what this parameter controls exactly? From what we observed, it seems that with the default value (which appears to be 1), the refined levels are very tightly nested. Setting a larger value increases the "gap" between the levels and results in much broader refined regions.

There seems to be a limit, however, to the value one can set:

P=0000000:Program abort called in file ``SAMRAI/hier/OverlapConnectorAlgorithm.C'' at line 1080
P=0000000:ERROR MESSAGE:
P=0000000:OverlapConnectorAlgorithm::privateBridge_prologue input error:
P=0000000:The given connector width limit, (10,10) ((10,10) in finest index space)
P=0000000:is not <= the maximum width of the bridge, (8,8) (in finest index space).

In our case, we have Nghost=1 ghost cell for our field data and a coarsening stencil width of 2 cells.

Thanks

nicolasaunai avatar Dec 03 '21 13:12 nicolasaunai

The tag_buffer value causes additional cells to be tagged around the cells that are explicitly tagged according to the tagging criteria. With a tag_buffer value of 1 on a 2D mesh, if a single cell is explicitly tagged, all 8 of its surrounding cells will also be tagged due to the tag buffer. The intent is to create a way to keep some refined space between the features of interest that trigger the tags and the edge of the PatchLevel.
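For intuition, the buffering step described above behaves like a morphological dilation of the tag array. The following is an illustrative sketch only, not SAMRAI's actual implementation:

```python
import numpy as np

def buffer_tags(tags, tag_buffer):
    """Grow each tagged cell by `tag_buffer` cells in every direction
    (including diagonals). Sketch of the effect of SAMRAI's tag_buffer,
    not its actual implementation."""
    buffered = tags.copy()
    nx, ny = tags.shape
    for i, j in zip(*np.nonzero(tags)):
        i0, i1 = max(0, i - tag_buffer), min(nx, i + tag_buffer + 1)
        j0, j1 = max(0, j - tag_buffer), min(ny, j + tag_buffer + 1)
        buffered[i0:i1, j0:j1] = 1
    return buffered

# A single tagged cell on a 5x5 mesh, buffered by 1: the 8 surrounding
# cells are tagged as well, giving 9 tagged cells in total.
tags = np.zeros((5, 5), dtype=int)
tags[2, 2] = 1
print(buffer_tags(tags, 1).sum())  # 9
```

The buffered tag array is what then gets clustered into boxes, which is why a larger tag_buffer yields broader refined regions.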

It appears you may have exposed a bug in this process. We would need a reproducer to look into this.

The tag_buffer is not specifically intended to control the nesting space between finer levels, though it can have that effect. There is another parameter, proper_nesting_buffer in PatchHierarchy, that can control the minimum nesting between levels.
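If controlling the nesting between levels is the goal, proper_nesting_buffer is read from the PatchHierarchy input database. A hedged sketch of such an input block (key names as in SAMRAI's PatchHierarchy input documentation; values here are only illustrative):

```
PatchHierarchy {
   max_levels = 3
   // one value per level that may be refined
   proper_nesting_buffer = 2, 2
}
```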

nselliott avatar Dec 08 '21 17:12 nselliott

I'm coming back to this issue because we still have it, and we would now like to test larger tag buffers. To reproduce the issue, you can use this Dockerfile, which pre-configures all the dependencies:

https://github.com/PHARCHIVE/phare-teamcity-agent/blob/master/Dockerfile

mkdir /path/to/test/dir ; cd /path/to/test/dir
git clone https://github.com/PHAREHUB/PHARE
mkdir build ; cd build
cmake -DCMAKE_CXX_FLAGS="-g3 -O3 -DPHARE_DIAG_DOUBLES=1" -DdevMode=OFF ../PHARE
make -j

Then you need to set the Python path:

export PYTHONPATH=/path/to/test/dir/build:/path/to/test/dir/PHARE/pyphare/

Then save the following into a file named kh.py:

#!/usr/bin/env python3

import pyphare.pharein as ph #lgtm [py/import-and-import-from]
from pyphare.pharein import Simulation
from pyphare.pharein import MaxwellianFluidModel
from pyphare.pharein import ElectromagDiagnostics,FluidDiagnostics
from pyphare.pharein import ElectronModel
from pyphare.simulator.simulator import Simulator, startMPI

from pyphare.pharein import global_vars as gv

import numpy as np
import matplotlib.pyplot as plt
import matplotlib as mpl
mpl.use('Agg')


diag_outputs="."
out="."




def density(x, y):
    return 1.


def S(y, y0, l):
    return 0.5*(1. + np.tanh((y-y0)/(l)))


def by(x, y):
    return 0.


def bx(x, y):
    return 0.


def bz(x, y):
    return 1.


def b2(x, y):
    return bx(x,y)**2 + by(x, y)**2 + bz(x, y)**2


def T(x, y):
    K = 1
    temp = 1./density(x, y)*(K - b2(x, y)*0.5)
    assert np.all(temp >0)
    return temp

def vx(x, y):
    from pyphare.pharein.global_vars import sim
    y0 = gv.sim.simulation_domain()[0]
    amp = 0.1
    k = 7
    x0 = y0*0.25
    x1 = y0*0.75
    v0 = amp*np.sin(((2*np.pi)/y0)*y*k)*np.exp(-(((x-x0)**2)/2))
    v1 = amp*np.sin(((2*np.pi)/y0)*y*k)*np.exp(-(((x-x1)**2)/2))
    return(v0 + v1)

def vy(x, y):
    from pyphare.pharein.global_vars import sim
    x0 = gv.sim.simulation_domain()[0]
    v1 = -0.5
    v2 = 0.5
    return(v1 + (v2-v1)*(S(x,x0*0.25,0.5) - S(x,x0*0.75,0.5)))

def vz(x, y):
    return 0.

def vthx(x, y):
    return 0.2

def vthy(x, y):
    return 0.2

def vthz(x, y):
    return 0.2


def config():

    Simulation(
        time_step_nbr=200,
        time_step=0.005,
        #boundary_types="periodic",
        cells=(200,200),
        dl=(0.4, 0.4),
        refinement="tagging",
        max_nbr_levels = 3,
        tag_buffer = 10,
        #smallest_patch_size = 20,
        nesting_buffer=1,
        refinement_boxes={},
        hyper_resistivity=0.005,
        resistivity=0.001,
        diag_options={"format": "phareh5",
                      "options": {"dir": diag_outputs,
                                  "mode":"overwrite"}},
        strict=True,
    )


    vvv = {
        "vbulkx": vx, "vbulky": vy, "vbulkz": vz,
        "vthx": vthx, "vthy": vthy, "vthz": vthz,
        "nbr_part_per_cell":100
    }

    MaxwellianFluidModel(
        bx=bx, by=by, bz=bz,
        protons={"charge": 1, "density": density,  **vvv},

    )

    ElectronModel(closure="isothermal", Te=0.0)

    sim = ph.global_vars.sim
    dt = 100*sim.time_step
    nt = sim.final_time/dt+1
    timestamps = (dt * np.arange(nt))

    for quantity in ["E", "B"]:
        ElectromagDiagnostics(
            quantity=quantity,
            write_timestamps=timestamps,
            compute_timestamps=timestamps,
        )


    for quantity in ["density", "bulkVelocity"]:
        FluidDiagnostics(
            quantity=quantity,
            write_timestamps=timestamps,
            compute_timestamps=timestamps,
            )



def main():
    config()
    s = Simulator(gv.sim)
    s.initialize()
    s.run()


if __name__=="__main__":
    main()        

and run:

cd /path/to/test/dir/
python3 kh.py

In the above kh.py file there is a line setting tag_buffer = 10; this fails and gives the error described in the first message. Setting the parameter to 8 works.

nicolasaunai avatar Jul 11 '22 12:07 nicolasaunai

I have identified the bug, and I am figuring out the fix for it.

nselliott avatar Jul 18 '22 22:07 nselliott

#208 is the bug fix to address this.

nselliott avatar Jul 25 '22 16:07 nselliott