Adding bpar to stella, explicit and implicit algorithms
This pull request delivers a fully functional implementation of the explicit and implicit algorithms using all three fields: phi, apar, and bpar. We start from the development/apar2 branch, where apar is included by introducing the distribution function gbar = g + 2 (Zs/Ts) vths vpa J0 apar, with g = h - (Zs/Ts) J0 phi. To include bpar, g is modified to g = h - (Zs/Ts)(J0 phi + 4 (J1/bs) mu (Ts/Zs) bpar).
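For reference, the same relations in display form (this is only a restatement of the formulas above; subscript s labels species, and all quantities follow the normalization implied by the prose):

```latex
% Restatement of the distribution-function relations above
% (stella normalized variables, as written in this PR description).
g = h - \frac{Z_s}{T_s}\left( J_0\,\phi
      + 4\,\frac{J_1}{b_s}\,\mu\,\frac{T_s}{Z_s}\,B_\parallel \right),
\qquad
\bar{g} = g + 2\,\frac{Z_s}{T_s}\, v_{\mathrm{th},s}\, v_\parallel\, J_0\, A_\parallel .
```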
For the explicit algorithm, standard flux-tube features are supported (tested with linear, collisionless simulations).
For the implicit algorithm, collisionless features are supported (tested with linear and nonlinear collisionless simulations).
Initial low-resolution nonlinear simulations have been performed with promising results (no blow-up!).
Linear, collisionless benchmarks have been carried out against GS2 for an ITG and a KBM example.
Not supported:

- perpendicular flow shear for simulations with A|| (the implicit implementation of perpendicular flow shear acts on g, not gbar, meaning that there are probably missing flow-shear terms involving A||; the B|| terms are included in the evolved g in the implicit algorithm, so there are no missing flow-shear terms for B||)
- radial variation
- full flux surface
- implicit collisions
- driftkinetic_implicit
- neoclassical terms
- in-code comments: many places refer to g as g = < f >, which is no longer true
- [ ] Support the implicit algorithm for flow shear and A||.
- [ ] Support the implicit algorithm for collisions.
- [x] Provide electromagnetic fluxes in the diagnostics (are separate variables preferred, or should the existing flux be upgraded to the total one? @mabarnes @GeorgiaActon)
- [ ] Provide in-code documentation.
- [ ] Provide benchmarks and report documentation.
- [ ] (Optional extra) Support neoclassical terms for apar and bpar (@mabarnes where are the equations written down?).
- [ ] (Optional extra) Support radial variation.
- [ ] (Optional extra) Support full flux surface (@GeorgiaActon I see you have another open PR, so perhaps this is a non-trivial job).
Attachments: collisionless_kbm_explicit.zip, collisionless_kbm_implicit.zip, collisionless_kbm_driftsimplicit.zip, collisionless_mtm_benchmark_explicit.zip, collisionless_mtm_implicit.zip
Example input files for a KBM benchmark with the various algorithm options, including an attempt to find a collisionless MTM using only phi and apar. Note that the agreement in the MTM case is not of the same order as for the KBM and requires further investigation. In the MTM case, if drifts_implicit = .true. is used, the system becomes numerically unstable to a grid-scale instability (with amplitude dominantly at the ends of the domain in zed) after 750 time steps.
Thus far, I have tested four linear input files, varying the properties listed below:

1. electrostatic, master stella version, drifts implicit
2. electrostatic, PR 129 stella version, drifts implicit
3. electrostatic, PR 129 stella version, drifts, stream, and mirror implicit
4. fully electromagnetic, PR 129 stella version, drifts, stream, and mirror implicit

Cases 1, 2, and 3 show excellent agreement. Case 4 also converges.
Next, I will look into collisionless MTM simulations, as @mrhardman was having difficulty getting agreement with GS2, indicating a possible issue with apar.
Michael, did you try increasing nperiod for the collisionless MTM simulations? I will attempt this if I find disagreement between GS2 and stella.
Confirmed that nperiod = 4 and nperiod = 7 did not make a difference to the growth rate in stella. I wonder if there are other knobs.
Brief experimentation with the input file for the implicitly treated MTM above suggests that the apparent bug with drifts_implicit = .true. is in fact caused by a bug in the stream_implicit = .true. algorithm. By setting omega* = 0 (fprim = tprim = 0) and include_apar = include_bpar = .false., manually setting cvdrift = gbdrift = gbdrift0 = cvdrift0 = 0 in stella_geometry.f90, and setting delt = 0.1 and aky = 0.005, I can find a numerical instability. My guess is that, since we normally use an automatic CFL constraint from the drifts, using drifts_implicit = .false. masks the numerical instability by reducing the time step size.
I note that once there are no drifts or drives in the input file, one can almost kill the instability by setting zed_upwind = 1.0. However, using periodic boundary conditions makes the system numerically unstable even with zed_upwind = 1.0.
Overall, I think further investigations into the implicit algorithm should focus on the basic electrostatic slab example without drifts or drives, to try to clarify where the underlying bug might be.
You can turn off the drifts without changing the source code by using the input flags:

```
&time_advance_knobs
  xdriftknob = 0.0
  ydriftknob = 0.0
  wstarknob = 0.0
/
```

You can also turn the streaming and mirror terms off using:

```
&physics_flags
  include_parallel_streaming = .false.
  include_mirror = .false.
/
```
Just to check: (1) with this example, is the scheme the same as on the master branch, and (2) does the master branch also show the same instability?
I remember in my implementation, I also changed the streaming algorithm (so that it updated h, the non-adiabatic piece, rather than g), and this made the electrostatic slab case less stable (for reasons I never managed to identify).
Today with @GeorgiaActon we experimented with the MTM benchmark above to determine what caused the numerical instability. We found that, for ky = 0.5, the instability was present with magnetic drifts (ydriftknob = 1.0) when xdriftknob = wstarknob = 0.0. Increasing ky made the instability grow faster; reducing ky reduced its growth rate. Reducing ky too far, however, triggered the numerical instability associated with parallel streaming. Input files and text output showing what was attempted are attached. stella_numerical_instability_notes.zip
Looking specifically at the low-ky instability without drives or drifts on the master branch, at commit 4cdc5fcd34a6bd9252873e2542fcd1c8c4636dff, reveals the same instability as observed in the development branch in this PR. stella_numerical_instability_master.zip
In this PR we use the same implementation as in master: we write the RHS of the equation in terms of a distribution function called 'g' such that there are no time derivatives of phi or bpar on the RHS. This is the same 'g' as solved for in GS2. To avoid the time derivative of apar, we switch to and from the distribution function 'gbar = h - Ze F0/Ts'. We do the time advance in 'gbar'.
Is there a knob for rescaling the parallel streaming magnitude? Can you trigger the numerical instability when ydriftknob << 1 in addition to xdriftknob = wstarknob = 0.0? Maybe it is a v_magnetic k_perp / (k_parallel v_ts) threshold effect: Michael's explicit MTM benchmark shows the real eigenmodes were in decent agreement for some theta values, but the agreement generally worsened at higher |theta|. Maybe agreement was better when v_magnetic k_perp / (k_parallel v_ts) was larger (relatively faster drifts)?
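Writing the proposed threshold parameter out explicitly (the symbol Lambda is introduced here only for readability; v_M denotes the magnetic-drift velocity named in the comment above):

```latex
% Dimensionless parameter speculated above: agreement may improve
% when \Lambda is larger (relatively faster drifts).
\Lambda \equiv \frac{v_M\, k_\perp}{k_\parallel\, v_{\mathrm{t}s}}
```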
@mabarnes @GeorgiaActon The last commits add the bpar contribution to the particle and heat fluxes, but the contribution to the perpendicular part of the momentum flux is not included. To support this, stella would need the Bessel function derivative dJ_1(x)/dx = (1/2)(J_0(x) - J_2(x)).
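As a quick sanity check of that identity, here is a minimal Python snippet (assuming scipy is available; this is just a numerical verification, not code proposed for stella):

```python
import numpy as np
from scipy.special import jv, jvp  # Bessel J_v and its derivative

x = np.linspace(0.1, 10.0, 100)
lhs = jvp(1, x)                      # dJ_1/dx, computed by scipy
rhs = 0.5 * (jv(0, x) - jv(2, x))    # recurrence form quoted above
assert np.allclose(lhs, rhs)         # identity holds to machine precision
```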
Notes on the numerical instability:

- Including parallel streaming in isolation seems stable, and including the mirror term in isolation seems stable, but including both without drifts can be unstable.
- The instability is worse with increasing dt, decreasing k_y, or decreasing qinp.
- These parameters are all knobs for the relative amplitudes of the streaming and mirror terms compared to the time derivative (see the sketch after this list).
- Turning off the phi contribution in the streaming term removes the instability. Likewise, keeping the phi contribution but reducing the amplitude of the g contribution to the streaming kills the instability; conversely, increasing the amplitude of the g contribution worsens the instability.
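To make the third bullet concrete, here is an illustrative (and heavily simplified) sketch of the scalings involved. The k_parallel ~ 1/(qinp * R) estimate and the function name are my own assumptions for illustration, not quantities taken from stella:

```python
# Illustrative scalings only -- not stella's actual stability criterion.
# Rough per-step weight of the streaming term relative to d/dt:
#   streaming ~ dt * v_th * k_par, with k_par ~ 1/(qinp * R) (assumed).
# Physical mode frequencies (omega*, drifts) scale with aky, so reducing
# aky shrinks the physical signal relative to any numerical mode.

def streaming_weight(dt, vth, qinp, major_R):
    """Return dt * v_th * k_par with the assumed k_par ~ 1/(qinp * R)."""
    kpar = 1.0 / (qinp * major_R)
    return dt * vth * kpar

# Increasing dt or decreasing qinp both increase this weight,
# consistent with the trends listed above.
print(streaming_weight(dt=0.1, vth=1.0, qinp=1.4, major_R=1.0))  # baseline
print(streaming_weight(dt=0.1, vth=1.0, qinp=0.7, major_R=1.0))  # larger
```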
A further note: all of these experiments were done with zed_upwind = 1.0, demonstrating that even maximal first-order upwinding does not guarantee a stable simulation.
An additional note: examining the collisionless_mtm_implicit.zip example above, using drifts_implicit = .true. makes the example numerically unstable. This might be consistent with the implicit algorithm not tolerating too large a time step. However, the instability is not repaired by setting include_mirror = .false., which may suggest that the electromagnetic implementation has additional bugs compared to the electrostatic implementation @GeorgiaActon @mabarnes.
Another check for the instability: turning off the drifts with

```
&time_advance_knobs
  xdriftknob = 0.0
  ydriftknob = 0.0
  wstarknob = 0.0
/
```

removes the time step restriction from the automatic CFL condition and appears to give a stable simulation at the nominal time step (the automatic CFL otherwise reduces the time step from the 1.0e-2 in the input file to 1.0e-3).
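For intuition, here is a hypothetical sketch of how an automatic CFL-style limiter of this kind behaves; the function and variable names are illustrative, not stella's internals:

```python
import numpy as np

def cfl_dt(dt_nominal, vdrift_x, vdrift_y, dx, dy, cushion=0.5):
    """Return the time step allowed by a simple CFL-style limit.

    Hypothetical sketch, not stella's actual routine: the drift knobs
    above scale the drift velocities, so zeroing them gives vmax = 0
    and leaves dt at its nominal value.
    """
    vmax = max(np.max(np.abs(vdrift_x)) / dx,
               np.max(np.abs(vdrift_y)) / dy)
    if vmax == 0.0:
        return dt_nominal          # no drift-driven restriction remains
    return min(dt_nominal, cushion / vmax)

# With drifts on, the limiter can cut dt by an order of magnitude,
# e.g. from 1.0e-2 toward 1.0e-3 as reported above.
vx = np.array([0.0, 30.0, -50.0])
print(cfl_dt(1.0e-2, vx, 0.0 * vx, dx=0.1, dy=0.1))        # limited: 1.0e-3
print(cfl_dt(1.0e-2, 0.0 * vx, 0.0 * vx, 0.1, 0.1))        # knobs zeroed: 1.0e-2
```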
@Scottyujia @mabarnes I have just noticed that the perpendicular flow shear terms connected to apar are likely missing, so perpendicular flow shear is likely not implemented properly in the electromagnetic case. I have updated the tracker at the top of this PR.
example_nonlinear_phiaparbpar_2.zip

An example of a very low-resolution nonlinear electromagnetic simulation in cyclone base case geometry. This case is definitely under-resolved but demonstrates successful execution of the code. Made with commit ee47b7ce6c0dacfe00cb165cf7582de1df4b9855 (HEAD -> development/apar2plusbpar, origin/development/apar2plusbpar), taking ~1.5 CPU hours to execute on 16 cores (about 5.6 minutes wall-clock). Perhaps suitable for "long" automatic tests.
Plots are made with

```
python3 xarray_post_processing/plot_stella.py example_nonlinear_phiaparbpar_2
```