feat: Adaptive multilinear interpolation for OBL solver

av-novikov opened this pull request 1 year ago • 10 comments

This PR adds interfaces for embedding Python functions into C++ code, adds adaptive multilinear interpolation driven by an exact evaluator supplied from Python, and adds support for both static and adaptive interpolation in ReactiveCompositionalMultiphaseOBLKernels. The ultimate goal is to run a simulation similar to open-darts, driven by a corresponding (partial) model supplied from Python.

  • [x] fix assembly in ReactiveCompositionalMultiphaseOBLKernels
  • [x] multilinear adaptive interpolation
  • [x] interfaces to call Python functions from C++ (from the interpolator)
  • [x] convergence test for multilinear adaptive interpolation
  • [x] generalize input of OBL fluid physics for static and adaptive interpolation
  • [x] pass Python-based OBL operators evaluator to solver
  • [x] use LvArray/src/python/PyFunc for embedding Python functions in C++ (see the sketch after this list)
  • [x] deduce access level in LvArray::python::PyArrayWrapper to support MODIFIABLE arguments: LvArray: PR 336
  • [x] unify casting-to-numpy interfaces between GEOS and open-darts: open-darts: MR 138
  • [x] Python interfaces to GEOS from Makutu repository: geosPythonPackages: PR 74
  • [x] Examples - 2ph_comp and carbonated_water (with PHREEQC-backed geochemistry) have been moved to geosPythonPackages: PR 74
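
For illustration, a minimal sketch of the Python-embedding idea, using the raw CPython API rather than LvArray's PyFunc wrapper; the evaluator signature (one state vector in, one sequence of operator values out) and all names below are assumptions for illustration, not the PR's actual interface:

#include <Python.h>
#include <stdexcept>
#include <vector>

// Call a Python evaluator: state point in, operator values out.
// Error handling and GIL management are simplified for the sketch.
std::vector<double>
callPythonEvaluator(PyObject* evaluator, std::vector<double> const & state)
{
  // Pack the state point into a Python list.
  PyObject* pyState = PyList_New(static_cast<Py_ssize_t>(state.size()));
  for (std::size_t i = 0; i < state.size(); ++i)
    PyList_SET_ITEM(pyState, i, PyFloat_FromDouble(state[i]));  // steals the ref

  // Invoke evaluator(state).
  PyObject* argsTuple = PyTuple_Pack(1, pyState);
  PyObject* result = PyObject_CallObject(evaluator, argsTuple);
  Py_DECREF(argsTuple);
  Py_DECREF(pyState);
  if (result == nullptr)
    throw std::runtime_error("Python evaluator raised an exception");

  // Unpack the returned sequence of operator values.
  Py_ssize_t const n = PySequence_Size(result);
  std::vector<double> values(static_cast<std::size_t>(n));
  for (Py_ssize_t i = 0; i < n; ++i)
  {
    PyObject* item = PySequence_GetItem(result, i);
    values[static_cast<std::size_t>(i)] = PyFloat_AsDouble(item);
    Py_DECREF(item);
  }
  Py_DECREF(result);
  return values;
}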

FAQ:

Q: Why do we use __uint128_t ? A: We need to count nodes and hypercubes in a multidimensional state space. Even with 300 points per axis in an 8-dim space (e.g. an 8-component fluid), the number of points (300^8) exceeds the maximum of a 64-bit integer. Moreover, a single __uint128_t key type simplifies hashing points and hypercubes in std::unordered_map storage.
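
For illustration, a sketch of the flattening and hashing; names and layout are assumptions, not the PR's actual code:

// Flatten a multidimensional grid index into a single __uint128_t key usable
// in std::unordered_map. With 300 points per axis in 8 dimensions,
// 300^8 ≈ 6.6e19 exceeds the 64-bit maximum (~1.8e19), so a 128-bit key is
// required. __uint128_t is a GCC/Clang extension.
#include <array>
#include <cstdint>
#include <functional>
#include <unordered_map>

constexpr int NDIM = 8;
constexpr std::uint64_t AXIS_POINTS = 300;

__uint128_t flattenIndex(std::array<std::uint64_t, NDIM> const & idx)
{
  __uint128_t key = 0;
  for (int d = 0; d < NDIM; ++d)
    key = key * AXIS_POINTS + idx[d];   // mixed-radix flattening
  return key;
}

// std::unordered_map has no hash for __uint128_t; fold the two 64-bit halves.
struct Uint128Hash
{
  std::size_t operator()(__uint128_t v) const
  {
    return std::hash<std::uint64_t>{}(static_cast<std::uint64_t>(v))
         ^ (std::hash<std::uint64_t>{}(static_cast<std::uint64_t>(v >> 64)) << 1);
  }
};

// Adaptively computed point data, keyed by the flattened index, e.g.:
// std::unordered_map<__uint128_t, PointData, Uint128Hash> pointStorage;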

Q: Do we duplicate the storage of points? A: Yes: we minimize memory accesses at the cost of redundant, but consecutive, storage.
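
A sketch of what that trade-off looks like in the interpolation kernel (illustrative, not the PR's actual layout): each hypercube owns a contiguous copy of the operator values at all 2^ndim of its vertices, so one interpolation reads a single consecutive block instead of gathering 2^ndim scattered point records; vertices shared between hypercubes are therefore duplicated.

#include <vector>

// Multilinear interpolation over one hypercube whose vertex values are
// stored consecutively (vertex v at index v, in bit order of its corners).
double interpolateHypercube(std::vector<double> const & vertexValues, // 2^ndim values, consecutive
                            std::vector<double> const & t,            // local coords in [0,1]^ndim
                            int ndim)
{
  double result = 0.0;
  int const nVertices = 1 << ndim;
  for (int v = 0; v < nVertices; ++v)
  {
    // Multilinear weight: product over axes of t[d] or (1 - t[d]),
    // depending on which side of axis d the vertex sits on.
    double w = 1.0;
    for (int d = 0; d < ndim; ++d)
      w *= (v >> d & 1) ? t[d] : (1.0 - t[d]);
    result += w * vertexValues[v];   // one consecutive read per vertex
  }
  return result;
}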

av-novikov avatar Oct 08 '24 16:10 av-novikov

Thanks @av-novikov for this PR! I think that the python scripts shouldn't go in a subfolder of compositionalMultiphaseFlow, but in a higher-level folder (maybe in a new folder scripts/pygeos?). We'll be adding the remaining Makutu scripts in the (possibly near) future and they'll be used for geophysics and potentially geomechanics workflows. All these base python classes and utilities need to be common to these other solvers as well. @CusiniM @rrsettgast what's your opinion?

sframba avatar Oct 18 '24 09:10 sframba

Thank you @sframba! I moved them into scripts/pygeosx.

av-novikov avatar Oct 21 '24 14:10 av-novikov

I think we should put all of these scripts in the python tools repository.

CusiniM avatar Oct 23 '24 16:10 CusiniM

@CusiniM @sframba Should I place the interfaces in different folders in the geosPythonPackages repo according to their purposes, e.g. input -> geos-xml-tools, mesh -> geos-mesh? Or should I place everything in the same folder, e.g. makutu-utilities?

av-novikov avatar Oct 24 '24 09:10 av-novikov

I think we should keep them in the same package/folder in the geosPythonPackages repo. That way we just have to do one pip install to get the Python package up and running. Two ways we could go:

  • create a new package called pygeos-handler, which indicates that the package is designed to manipulate GEOS (inputs, solvers, etc.)
  • reuse the existing package pygeos-tools, which is a more generic tooling package for interacting with pygeos

@CusiniM @sframba any preference/suggestion?

Bubusch avatar Oct 28 '24 16:10 Bubusch

Dear @Bubusch @sframba @CusiniM, the Python interfaces are going to be placed in pygeos-tools in geosPythonPackages: PR 44. Hope that works for you.

av-novikov avatar Oct 30 '24 09:10 av-novikov

@dkachuma @paveltomin can you help review this PR? Thanks.

herve-gross avatar Apr 25 '25 19:04 herve-gross

@av-novikov the following patch should fix the issues with MGR

diff --git a/src/coreComponents/linearAlgebra/interfaces/hypre/mgrStrategies/ReactiveCompositionalMultiphaseOBL.hpp b/src/coreComponents/linearAlgebra/interfaces/hypre/mgrStrategies/ReactiveCompositionalMultiphaseOBL.hpp
index ee2561ad4..172aa06e4 100644
--- a/src/coreComponents/linearAlgebra/interfaces/hypre/mgrStrategies/ReactiveCompositionalMultiphaseOBL.hpp
+++ b/src/coreComponents/linearAlgebra/interfaces/hypre/mgrStrategies/ReactiveCompositionalMultiphaseOBL.hpp
@@ -68,8 +68,8 @@ public:
 
     m_levelFRelaxType[0]          = MGRFRelaxationType::none;
     m_levelInterpType[0]          = MGRInterpolationType::injection;
-    m_levelRestrictType[0]        = MGRRestrictionType::injection;
-    m_levelCoarseGridMethod[0]    = MGRCoarseGridMethod::cprLikeBlockDiag;
+    m_levelRestrictType[0]        = MGRRestrictionType::blockColLumped; // True-IMPES
+    m_levelCoarseGridMethod[0]    = MGRCoarseGridMethod::galerkin;
     m_levelGlobalSmootherType[0]  = MGRGlobalSmootherType::ilu0;
     m_levelGlobalSmootherIters[0] = 1;
   }

victorapm avatar Aug 22 '25 21:08 victorapm

Dear @victorapm,

Thank you very much for your patch! I tried running with

m_levelRestrictType[0]     = MGRRestrictionType::blockColLumped;
m_levelCoarseGridMethod[0] = MGRCoarseGridMethod::galerkin;

and it helped, but did not fully solve the problem.

Now the 1D homogeneous model with injection of carbonated water converges, although it spends many linear iterations (LI) per Newton iteration (NI): 10-60, avg ~45. The similar 2D 100x100 heterogeneous setup starts at 15-40 LI/NI but then rises relatively quickly to the maximum value of 200 LI/NI. In the end it took too much time and I decided to stop the simulation. Before, both the 1D and 2D setups were non-converging, i.e. 200 LI were not enough to reduce the error.

  • 1D setup, log without your patch: output_1d_verbose.txt
  • 1D setup, log with your patch: output_victor_patch_1d_verbose.log
  • 2D setup, log without your patch: output_2d_verbose.txt
  • 2D setup, log with your patch: output_victor_patch_2d_verbose.log

I can share the hypre output with matrices if needed. Please let me know how I should proceed.

Best, Aleks

av-novikov avatar Aug 29 '25 13:08 av-novikov

Dear @victorapm,

Indeed, merging the develop branch, with its recently added scaling option, together with the fix you suggested improved the performance of the linear solver. Now it takes far fewer iterations, and both the 1D homogeneous and 2D heterogeneous setups converge smoothly. Logs for both are attached.

  • output_merge_1d.log
  • output_merge_2d.log

Thank you very much!

av-novikov avatar Sep 12 '25 15:09 av-novikov