
[WIP] Use correct stresses in visco-elasto-plastic rheology

Open anne-glerum opened this issue 3 years ago • 25 comments

For now, this PR is meant for discussing with everybody interested the changes to the elasticity and visco-plastic plugins that @bobmyhill, @naliboff and I discussed yesterday. @dansand, your input would be great to have!

FYI @EstherHeck

For all pull requests:

  • [ ] I have followed the instructions for indenting my code.

For new features/models or changes of existing features:

  • [ ] I have tested my new feature locally to ensure it is correct.
  • [ ] I have created a testcase for the new feature/benchmark in the tests/ directory.
  • [ ] I have added a changelog entry in the doc/modules/changes directory that will inform other users of my change.

anne-glerum avatar Sep 30 '21 11:09 anne-glerum

Thanks for the comments @naliboff, I've addressed them where possible. Maybe @bobmyhill, @jdannberg or @gassmoeller can also give this a look?

anne-glerum avatar Nov 09 '21 15:11 anne-glerum

1. Can the stress_component_statistics still be removed, per a comment in the .cc file? If it does need to be included, I will review that as well.

I think the statistics postprocessor could still be useful, as it gives the statistics of the full stress at time t, instead of the rotated and advected, but not updated, stress at time t. I've changed the comments to reflect this.

2. Are the updates to stress.cc implementing the same changes as proposed in #4375?

No. The changes in #4375 are valid for the VEP implementation in main, the changes in stress.cc are for the VEP implementation in this PR.

anne-glerum avatar Jun 28 '22 08:06 anne-glerum

@anne-glerum I tried to have a look at this PR, but you have a lot of unrelated changes in your branch (see https://github.com/geodynamics/aspect/pull/4370/files). It looks like you may have merged the 2.4.0 release branch, or some other changes. Can you rebase and try to remove the unrelated changes from your branch?

gassmoeller avatar Apr 13 '23 13:04 gassmoeller

Hi @gassmoeller, yes, I can see that makes reviewing very complicated. I rebased to the release because that is the version we wanted to use for the paper.

I can make a copy of this branch (based on the release) and then rebase this PR to main. Would it be okay if I squash my commits before I rebase? I remember rebasing to the release being very laborious because of a multitude of commits giving merge conflicts.

Alternatively, I could update the copy of this branch that I still have from before the rebase to the release with the changes I have made in the meantime, and then rebase that to main. I'd then still prefer to squash the commits first. I'm not sure which option is preferable or whether it makes a difference.

anne-glerum avatar Apr 17 '23 09:04 anne-glerum

I can make a copy of this branch (based on the release) and then rebase this PR to main. Would it be okay if I squash my commits before I rebase?

Yes, I think that is the best approach. Keep a copy based on the release (for reference in the paper), but rebase to main so that we can merge it. Squashing the commits is fine.

gassmoeller avatar Apr 17 '23 13:04 gassmoeller

I can make a copy of this branch (based on the release) and then rebase this PR to main. Would it be okay if I squash my commits before I rebase?

Yes, I think that is the best approach. Keep a copy based on the release (for reference in the paper), but rebase to main so that we can merge it. Squashing the commits is fine.

Okay, thanks. I will do that and let you know when I'm done :)

anne-glerum avatar Apr 18 '23 09:04 anne-glerum

Hi @gassmoeller, I have rebased to main. I also ran the third benchmark (2D square under simple shear for half of the model time) again; I summarize the results below. Maybe you could keep the issues listed below in mind when going through the code.

  1. Results are independent of resolution for both fields and particles. 👍
  2. Results are independent of the number of nonlinear iterations allowed for fields. 👍
  3. The error during shearing decreases for both fields and particles with smaller timesteps. 👍
  4. The step in the error when shearing is stopped decreases with smaller timesteps for both fields and particles 👍, but only down to a timestep size of 0.0025 s. 👎 (figure: 3_viscoelastic_build-up_simple_shear_particles_dtc_isnot_dte_SS_dt)
  5. After shearing has stopped, the particle runs use all the allowed nonlinear iterations (NI). The more iterations are allowed, the larger the error. 👎 (figure: 3_viscoelastic_build-up_simple_shear_particles_dtc_isnot_dte_SS_NI)
  6. The error is not exactly the same between fields and particles 👎, but the difference decreases with timestep size 👍. After shearing has stopped, the difference is largely due to the step in the error for the particle runs. (figure: 3_viscoelastic_build-up_simple_shear_fields_particles_dtc_isnot_dte_SS_test_dt_GR2)

anne-glerum avatar Jun 05 '23 13:06 anne-glerum

ad. 6. Could one difference between the particle and the field runs lie in the reaction_term? It computes the spin tensor W from the solution of the previous timestep, but for particles this solution is evaluated at the new location of the particles, I think? However, this shouldn't matter, as the spin tensor is constant over the domain.
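For reference, the spin tensor and the rotation contribution to the stress update being discussed here can be written as follows. This is just the common textbook form of the co-rotational (Jaumann) term, not copied from the code; v is the velocity from the previous timestep, and the notation for the old and rotated stress is mine:

```math
W = \frac{1}{2}\left(\nabla \mathbf{v} - (\nabla \mathbf{v})^{T}\right),
\qquad
\boldsymbol{\tau}^{\mathrm{rot}} \approx \boldsymbol{\tau}^{\mathrm{old}}
+ \Delta t \left( W \, \boldsymbol{\tau}^{\mathrm{old}} - \boldsymbol{\tau}^{\mathrm{old}} \, W \right)
```

For the simple-shear benchmark the velocity gradient, and hence W, is spatially uniform, which is why evaluating it at the old or the new particle location should give the same rotation term.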

ad. 5. The reaction_term used to update the stresses on the particles changes linearly over the NI within one timestep instead of becoming constant. The figure shows the reaction_term value for a specific particle for a run with 100 NI allowed, from the moment shearing is stopped. (figure: 3_vexy_R)

EDIT: Actually, the composition property plugin also applies its update every nonlinear iteration, increasing the stress value in every nonlinear iteration. This happens before the elastic_stresses property plugin requests the reaction term. The composition plugin is only needed to apply the update from the operator splitting to the particle properties, so it should only act once per timestep.

EDIT 2: Only applying the composition property plugin during the first nonlinear iteration leads to a slower decrease of the stresses after shearing has stopped, which results in a large error.

anne-glerum avatar Jun 05 '23 14:06 anne-glerum

Hi @anne-glerum, I think I understand the problem (after about 2 hours of reconstructing the order of operations 😄) and have an idea for a fix, but it is a bit complex.

Problem: By using the combination of the particle properties composition and elastic_stress, and interpolating the resulting property back into a compositional field, we are circumventing the particle reset that happens at the start of every nonlinear iteration. Here is why: at the start of a NI, the particle properties are reset. Then the particle stress property is overwritten from the solution vector in the composition property plugin. Then the elastic_stress plugin modifies that value. Then that value is interpolated back to the field, into the solution vector. Now the cycle starts again: we reset the particles, but then overwrite the properties with the solution vector (which was not reset). Therefore stress is accumulated indefinitely in every iteration.

Solution: I think this will be tricky, I don't see an easy fix. We cannot limit updating the particle properties from the field to the first nonlinear iteration, because in subsequent NIs the particles will be reset to their original values from the previous timestep again, so we would miss the update that happened in the operator splitting step. I think we need to discuss three avenues and decide which one is most promising:

  1. Break the cycle of constant updates somehow, e.g. by letting the elastic_stress particle property have its own set of properties (instead of reusing composition). I don't quite see whether this will work, but it feels like it should be possible somehow.
  2. Make operator splitting work within nonlinear iterations. Then we could move the operator splitting step into the loop, and presumably would reset the fields as well at the start of each NI (then we could use our current workflow, and updates would not accumulate over NIs).
  3. Implement operator splitting on particles. We don't really need full operator splitting, just a possibility to update particle properties before they are advected (e.g. via a signal). Then we could do the work we currently do in the operator splitting step directly in the particle property, and that would mean we don't need to update particle properties from the fields (and therefore the reset at the beginning of the NI would work correctly). This would mean elastic stress would be completely stored and computed on particles, and the compositional field would be a read-only convenience copy for the Stokes equation only.

Each solution is a bit tricky, but I think we need one of them. I prefer 2 or 3, because they also add functionality to ASPECT and are not limited to our current problem. Maybe something we need to work on during the hackathon?
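To make the cycle described above concrete, here is a toy illustration (plain C++, not ASPECT code; all names and numbers are made up) of why the stress keeps growing: the field value copied back onto the particles is never reset, so the per-timestep update is re-applied in every nonlinear iteration.

```cpp
// Toy illustration of the accumulation over nonlinear iterations.
#include <iostream>

int main()
{
  double particle_stress = 1.0;          // stress carried by a particle
  double field_stress    = 1.0;          // same stress interpolated to the field
  const double reaction_term = 0.1;      // per-timestep update (illustrative)

  for (unsigned int iteration = 0; iteration < 5; ++iteration)
    {
      particle_stress = 1.0;             // restore_particles(): reset to old value
      particle_stress = field_stress;    // 'composition' property: overwrite from field
      particle_stress += reaction_term;  // 'elastic_stress' property: apply update
      field_stress = particle_stress;    // interpolate back to the field

      std::cout << "iteration " << iteration
                << ": stress = " << particle_stress << '\n';
      // prints 1.1, 1.2, 1.3, ... although the update should be applied once
    }
}
```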

Could you try the problematic benchmark with single Advection, iterated Stokes? It may not be very accurate, but the solution for particles and fields should be much closer together?

gassmoeller avatar Jun 07 '23 21:06 gassmoeller

Ohhh brilliant @gassmoeller! Thank you for taking the time to figure this out!

Concerning option 3, we would need access to the old_solution at the location of particles to compute the operator splitting update, instead of the solution the plugins get now, but that's easily done.

It seems a bit like duplicating functionality (having an operator splitting for fields and for particles), but completely storing and computing stresses on the particles is a clean option. And, like you say, it's an update to a property value, it doesn't need the additional functionality of solving the ODEs over multiple timesteps.

Maybe we can discuss it tomorrow. And yes, this does sound like a hack project.

Could you try the problematic benchmark with single Advection, iterated Stokes? It may not be very accurate, but the solution for particles and fields should be much closer together?

Using iterated Advection and Stokes with max one NI or single Advection, iterated Stokes gives the same result. It is indeed much closer to the fields, but still a bit off: (figure: 3_viscoelastic_build-up_simple_shear_fields_particles_dtc_isnot_dte_SS_test_dt_GR2; compare the dark blue solid, dashed and dotted lines above)

(figure: 3_viscoelastic_build-up_simple_shear_particles_dtc_isnot_dte_SS_NI; compare the pink and blue lines with the vertical dashes above)

anne-glerum avatar Jun 12 '23 15:06 anne-glerum

We discussed the options you suggested above in our meeting yesterday and came up with some potential ways of implementing the third option. Here is how I think it could be done:

  1. Add a function in helper_functions.cc (where the current operator splitting is housed). The function is called at the beginning of a nonlinear iteration, after the particles have been reset (e.g. after restore_particles() in line 1140 of solve_iterated_advection_and_stokes()). It is only called if particle_world.get() != nullptr and Use elasticity = true (or we check for the two material models visco_elastic and visco_plastic, or there is a more general check for applying an update to the particle properties, or there is a signal). The function loops over all particles, creating MaterialModelInputs and MaterialModelOutputs for each particle, filling the Inputs with the solutions from the last timestep at the particle's current (= old) position. Using these, the material model is evaluated. The returned reaction rate is multiplied by the current timestep and then added to the appropriate property (the stresses).
  2. Instead of a function, we add a signal post_restore_particles that connects to the same location, so after particles have been restored. The functionality that calls the material model for the reaction_rates and updates the relevant particle properties is placed in the particle property plugin elastic_stress. Other particle properties could use the same signal.
  3. We add the update in the particle property elastic_stress before the reaction_terms are added. The downside is that the properties are updated after the particles have been advected, so we would have to retrieve the old particle locations (maybe from the particle world backup?) and evaluate the old solution at those locations.

Options 1 and 2 are the most straightforward, I think; a rough sketch of option 2 follows below. This would probably be a separate PR?
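To make option 2 a bit more concrete, here is a minimal sketch of how the proposed signal could be declared and used. Everything below is hypothetical: the signal signature and the lambda body are placeholders for whatever we end up implementing (only the name post_restore_particles comes from the discussion above); the SimulatorSignals / SimulatorAccess classes and the ASPECT_REGISTER_SIGNALS_CONNECTOR mechanism are existing ASPECT infrastructure.

```cpp
#include <aspect/simulator_signals.h>
#include <aspect/simulator_access.h>

namespace aspect
{
  // Hypothetical new member of SimulatorSignals<dim> (sketched here as a
  // comment; it would live in simulator_signals.h). It would be triggered
  // right after restore_particles(), i.e. after the particle properties have
  // been reset at the start of a nonlinear iteration:
  //
  //   boost::signals2::signal<void (const SimulatorAccess<dim> &)>
  //     post_restore_particles;

  template <int dim>
  void signal_connector (SimulatorSignals<dim> &signals)
  {
    signals.post_restore_particles.connect(
      [](const SimulatorAccess<dim> &simulator_access)
      {
        // The elastic_stress particle property (or a helper it provides)
        // would do the actual work here:
        //  - loop over all particles,
        //  - fill MaterialModelInputs from the old solution at the (old)
        //    particle locations and evaluate the material model,
        //  - add reaction_rates * timestep to the ve_stress_* properties.
        (void) simulator_access;
      });
  }

  ASPECT_REGISTER_SIGNALS_CONNECTOR(signal_connector<2>,
                                    signal_connector<3>)
}
```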

anne-glerum avatar Jun 14 '23 08:06 anne-glerum

Option 2 sounds the most appealing to me, because it introduces the least amount of specific code into main ASPECT and almost all new code can go into the particle property plugin.

Long term we should unify the particle advection and particle update property to use the same integration scheme (currently we use one scheme for advection, and then do a single step update of the properties afterwards). But that is for another time.

gassmoeller avatar Jun 14 '23 22:06 gassmoeller

Option 2 sounds the most appealing to me, because it introduces the least amount of specific code into main ASPECT and almost all new code can go into the particle property plugin.

Alright! I can start with a separate PR that introduces the signal and has a test with a particle property plugin that uses the signal and does a simple update without calling the material model. Or shall I implement it directly in this PR so that we can see whether it fixes the issues?

anne-glerum avatar Jun 15 '23 10:06 anne-glerum

Let's add it to this PR; if it doesn't work out, it is easier to remove it again this way. I would still recommend you start with a simpler test than the elasticity problem to make sure it works as intended.

gassmoeller avatar Jun 15 '23 14:06 gassmoeller

Alright, I first implemented the signal and a new particle property that just uses the signal to increase its property by a constant every timestep. That worked, showing the correct particle property value after a certain amount of time. It's on this branch: https://github.com/anne-glerum/aspect/tree/signal_after_particle_reset

Then I used this signal in the current PR and modified the elastic_stresses particle property to

  1. Initialize the stress tensor properties again (so the composition property is no longer used)
  2. Update the particle stress values after the particles have been restored at the beginning of a nonlinear iteration
  3. Use the stress values on the particles as material model input, not the field solutions, for both the new stress update using the material model reaction rates and the original property update that uses the material model reaction term

I also had to

  1. Change any AssertThrow that checks use_operator_splitting to take into account that we can have it set to false but still want to compute the reaction_rates in the material model. I used (this->get_parameters().mapped_particle_properties).count(this->introspection().compositional_index_for_name("ve_stress_xx")) as the condition to check whether the elastic_stress particle property needs the reaction rates (written out in the snippet after this list), but I don't think that's the best way to do it, since the field name could be used without the particle property. Maybe we need an additional bool for the particles, or we check whether the specific particle plugin elastic_stress is used.
  2. Use the stress tensor on the particles as the old stress tensor to compute the reaction term in the material model
  3. Copy a lot of code from world.cc to fill the MaterialModelInputs, that could maybe be refactored.
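For readability, here is the condition mentioned in point 1 above written out as a block. This is just the expression quoted there with comments added; the variable name and exact types are illustrative:

```cpp
// True if the compositional field "ve_stress_xx" gets its values mapped from a
// particle property (i.e. the elastic_stress particle property is in use), in
// which case the reaction_rates have to be computed even when operator
// splitting is switched off.
const bool stress_lives_on_particles =
  this->get_parameters().mapped_particle_properties.count(
    this->introspection().compositional_index_for_name("ve_stress_xx")) > 0;
```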

Although it can be improved, it does work!

The dotted line shows the new behaviour, which is very similar to only allowing one NI before: (figure: 3_viscoelastic_build-up_simple_shear_fields_particles_dtc_isnot_dte_SS_test_dt_GR2_OS). Note that there is still a small difference between field and particle results.

The error no longer increases when increasing the maximum number of NI (the dashed lines all plot on top of each other): (figure: 3_viscoelastic_build-up_simple_shear_particles_dtc_isnot_dte_SS_NI_noOS)

And smaller timesteps still decrease the error (except when dt = 0.00125 s after shearing has stopped): (figure: 3_viscoelastic_build-up_simple_shear_particles_dtc_isnot_dte_SS_dt_noOS)

anne-glerum avatar Jun 21 '23 09:06 anne-glerum

Hi @gassmoeller and @bobmyhill , I've gone through and updated the documentation and addressed your comments. I'll fix the merge conflict tomorrow, but even before that I think this is ready for another look.

anne-glerum avatar Jul 10 '23 05:07 anne-glerum

Hi Rene, I see this has accumulated some merge conflicts; I'll resolve them tomorrow.

anne-glerum avatar Sep 26 '23 19:09 anne-glerum

Some tests fail to run; I will address that asap.

anne-glerum avatar Oct 06 '23 09:10 anne-glerum

One test fails because it uses an elastic damper, which has been disabled. I'm not sure what to do there. All other failing tests except one do run, but produce different output (e.g., more fields are solved for). The test that didn't run before (visco_plastic_vep_stress_build-up_yield_particles) needed some changes to its prm file and should run now.

anne-glerum avatar Dec 04 '23 15:12 anne-glerum

@anne-glerum: if you're happy that the performance is good, I guess we can turn the elastic damper back on. Is the "different output" correct as far as you can tell?

bobmyhill avatar Dec 04 '23 17:12 bobmyhill

Concerning the 36 failing tests, there are several reasons why they fail. First of all, we have made all setups with elasticity use DG for the composition, with an iterated Advection and Stokes scheme. This adds DOFs and nonlinear iterations. Second, there are more specific reasons:

  1. Needs an elastic damper, but this is no longer allowed: 229 - damped_viscoelastic_bending_beam

  2. Still some run errors due to prm settings:
     945 - visco_elastic_top_beam
     1035 - visualization_shear_stress_viscoelastic
     1042 - visualization_stress_viscoelastic

  3. Uses the DG limiter with a non-stress field; therefore that field's advection equation doesn't converge with the now multiple NI per timestep:
     285 - elastic_stabilization_time_scale_factor
     402 - infill_density_flexure
     982 - visco_plastic_vep_bending_beam -> does 100 NI -> change max NI
     984 - visco_plastic_vep_plate_flexure
     1012 - viscoelastic_bending_beam
     1013 - viscoelastic_bending_beam_averaging
     1014 - viscoelastic_bending_beam_gmg
     1016 - viscoelastic_bending_beam_principal_stress
     1020 - viscoelastic_plate_flexure

  4. Because of the implemented changes, the resulting velocity or topography statistics have now changed significantly:
     116 - boundary_traction_function_cartesian_free_surface
     117 - boundary_traction_function_spherical_free_surface
     307 - free_surface_VE_cylinder_2D_loading_fixed_elastic_dt_gmg -> maybe don't use dte = 10 dtc
     319 - free_surface_timestep_repeat -> doesn't repeat
     338 - gmg_mesh_deform_ghost_entries
     812 - skip_refinement

For completeness, here is the list of tests whose outputs have changed but whose setups don't need to be changed:

  1. Because of switching on DG, the number of DOFs has increased and therefore the output files changed (these don't need fixing):
     987 - visco_plastic_vep_stress_build-up_fixed_elastic_time_step
     1015 - viscoelastic_bending_beam_particles
     1017 - viscoelastic_fixed_elastic_time_step
     1018 - viscoelastic_numerical_elastic_time_step
     1019 - viscoelastic_numerical_elastic_time_step_particles
     1025 - viscoelastoplastic_yield_plastic_viscous_strain_weakening

  2. Because of switching on an iterative solver scheme, more (but a small number of) NI are now done (these don't need fixing):
     986 - visco_plastic_vep_stress_build-up
     988 - visco_plastic_vep_stress_build-up_stress_averaging
     1022 - viscoelastic_stress_averaging
     1023 - viscoelastic_stress_build-up
     1024 - viscoelastic_stress_build-up_gmg

  3. Because of switching on iterated Advection, new solves in the screen output (these don't need fixing):
     548 - newton_postprocess_nonlinear
     983 - visco_plastic_vep_brick_extension
     985 - visco_plastic_vep_simple_shear
     989 - visco_plastic_vep_stress_build-up_yield
     990 - visco_plastic_vep_stress_build-up_yield_particles
     1021 - viscoelastic_sheared_torsion

anne-glerum avatar Dec 19 '23 10:12 anne-glerum

Concerning the tests in categories 1-4:

  1. As mentioned by Bob, there are others who wish to implement a more comprehensive damper scheme once this PR is merged. Perhaps we can put this test on hold somehow?
  2. I'm trying to fix these test prms. EDIT: Done; they now fall into the "no need to fix" category.
  3. Related to #5529 and #5531.
  4. These tests need to be checked whether they still do what they are supposed to do.

anne-glerum avatar Dec 19 '23 13:12 anne-glerum

@gassmoeller What do you think about test 229 with the elastic damper that we have disabled in this PR? Could we change the test results to succeed if the Assert for the damper is triggered? Or is there a way to both preserve the current/main test output and have the test succeed with the new implementation?

For these tests:
319 - free_surface_timestep_repeat -> doesn't repeat
338 - gmg_mesh_deform_ghost_entries
812 - skip_refinement
I'll see what I can do to get them closer to the original results, but if that doesn't work out, how do you feel about removing the elasticity? Elasticity doesn't seem integral to what these tests are testing, so I could remove it in a separate PR.

anne-glerum avatar Dec 19 '23 16:12 anne-glerum