Make NuclideBases not global
The data in `nuclideBases.py` is global, which leads to considerable pain:
- Parallel unit tests choke on this and have to build workarounds.
- Because this is global, it is nigh impossible to have a setting supply different `nuclide.dat` input data.
- The ordering is very sensitive, which makes the import-time logic complicated.
- The burn chain logic is very sensitive and can only be run once.
- This limits our LFP logic.
Fixing this problem has wide support among our user base, but has proven quite time-consuming, so we will break this work into different PRs:
- [x] PR: Make the classes `NuclideBases` and `Elements` and use them to encapsulate all the global data in `nuclideBases.py` and `elements.py`. Put one instance of each of these in the global scope to corral the data, and keep the current API endpoints. (Backwards compatible; see the sketch after this list.)
- [x] PR: Make thermal scattering not global and drastically simplify the logic.
- [x] PR: Add a reference to the global nuclide data in `Reactor`, and let every `Composite` directly reference it as well. Rewrite any code we can without breaking the API. (Backwards compatible.)
- [ ] Ditch global nuclides. Finally remove all global nuclides and exclusively use `Reactor.nuclideBases`. (API breaking.)
- [ ] Add a `nuclides.dat` setting so people can provide their own `nuclides.dat` file.
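To make the first checklist item concrete, here is a minimal sketch of the encapsulation idea: wrap the module-level dictionaries in a class and expose a single instance in the global scope so existing lookups keep working. The class names come from the plan above; the constructor and `add` method are illustrative, not ARMI's actual implementation.

```python
class NuclideBases:
    """Corrals what are currently module-level dicts in nuclideBases.py (sketch only)."""

    def __init__(self):
        self.byName = {}
        self.byDBName = {}
        self.byLabel = {}
        self.byMcc3Id = {}
        self.byMcnpId = {}
        self.byAAAZZZSId = {}

    def add(self, nuc):
        """Register one nuclide base in the lookup tables (illustrative)."""
        self.byName[nuc.name] = nuc
        self.byLabel[nuc.label] = nuc


# One instance in the global scope corrals the data; the old module-level
# names can simply point at its attributes, so existing calls like
# nuclideBases.byName["U235"] keep working unchanged.
instance = NuclideBases()
byName = instance.byName
byLabel = instance.byLabel
```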
We took a step toward making the global nuclide bases less painful by adding more nuclides in #998. Unfortunately, the nuclide data is still global, and the burn chain modifications can still occur when a burn chain is imposed. I specifically added logging statements for when a user overwrites any default from the burn-chain.yaml or a custom file, but the transmutation and decay data attached to those nuclides still gets modified based on the user-defined burn chain data.
We can probably take this further still.
@opotowsky
This has reared its head again and will have to be put on the calendar for ARMI 0.4.0. It turns out ARMI forces everyone to use RIPL nuclear data, but we have some users who use ENDF in other parts of their modeling chain. So I would LOVE to give people a setting to choose which nuclear data they want. But we can't do that (easily) if the nuclear data lives in the global scope at import time.
So, finally, I think we have enough interest to make this ticket happen.
I notice that it is not just nuclideBases that is global, but also elements and thermalScattering. We even have an unrelated ticket open to rip thermalScattering out of the global scope.
So, let's try to itemize the nuclide information that is currently in the global scope, and where (outside the nucDirectory area) it is used:
nuclideBases
| Name | Usage |
|---|---|
| byName | very popular in downstream projects; various materials; NuclideFlag; component; composite; densityTools |
| byDBName | one downstream plotting function |
| byLabel | two downstream projects; Uranium.setDefaultMassFracs; xsNuclides.updateBaseNuclides |
| byMcc2Id | unused |
| byMcc3Id | downstream dif3d and depletion codes; dlayxs; isotopicOptions.getAllNuclideBasesByLibrary |
| byMcc3IdEndfbVII0 | only used in this file |
| byMcc3IdEndfbVII1 | only used in this file |
| byMcnpId | downstream burnx and mcnp code |
| byAAAZZZSId | single isotxsSample file in nucleardata-uq |
elements
| Name | Usage |
|---|---|
| byZ | blocks::getPuMoles; isotopicOptions::autoUpdateNuclideFlags(); Composite.constituentReport() |
| byName | isotopicOptions.eleExpandInfoBasedOnCodeENDF(cs) |
| bySymbol | many places downstream; isotopicOptions::ALLOWED_KEYS and .expandMassFracs; Composite::getNuclidesFromSpecifier and .constituentReport(); Water.setDefaultMassFracs |
thermalScattering
| Name | Usage |
|---|---|
| byNbAndCompound | materials like BE9 and Graphite; OpenMC plugin |
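For context, this is how a couple of the global dictionaries in the tables above are reached today; it is exactly these import-time module attributes that this ticket wants to move. A minimal example assuming an ARMI environment where the nuclide directory is populated at import (the situation this issue describes); the key spellings follow ARMI's usual `U235`/`U` conventions.

```python
from armi.nucDirectory import elements, nuclideBases

# Module-level lookups, populated at import time -- this is the global state.
u235 = nuclideBases.byName["U235"]   # heavily used downstream and in materials
uranium = elements.bySymbol["U"]     # used by Composite.getNuclidesFromSpecifier, etc.
print(u235, uranium)
```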
Materials
I am a touch worried about the current state of ARMI materials because of this "global nuclides" situation. Currently, if you update the burn chain, all the global nuclides can be updated, which means the mass fractions of materials can change in the middle of a run.
That is dangerous in parallel unit tests.
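A toy illustration (not ARMI code) of the hazard described above: when the burn chain mutates a shared module-level table, any test that runs after it sees the modified data, so results depend on execution order.

```python
# Stand-in for a global nuclide table built at import time.
SHARED = {"U238": {"halfLife": 1.41e17}}

def applyBurnChain(overrides):
    """Mutates the shared table in place, as imposing a burn chain effectively does."""
    for name, data in overrides.items():
        SHARED.setdefault(name, {}).update(data)

def test_custom_chain():
    applyBurnChain({"U238": {"halfLife": 0.0}})  # this test's custom burn chain

def test_default_data():
    assert SHARED["U238"]["halfLife"] > 0  # fails if test_custom_chain ran first
```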
@aaronjamesreynolds @opotowsky And anyone else who is interested... I have a branch out now to start this ticket: obliterate_global_nuclides.
(To see the current state of the branch go here.)
All of these globals will be moved into this new class:
https://github.com/terrapower/armi/blob/a76a238dd936fe43e7981b608a3e40228f72ce22/armi/nucDirectory/nuclideBases.py#L1470
Note the big tables above; this is only the start. But merge freeze or not, I can start thinking about all this.
This looks good @john-science. Are you thinking that this class would be instantiated on an application basis, and then, once the application is configured, we could call through `app.nuclides`? Or did you have something else in mind for how it should be connected and configurable?
> Are you thinking that this class would be instantiated on an application basis ...
Well, at the moment, I am guessing the nuclide bases have to live on the Reactor, so `r.nuclideBases`. But that's just a guess. The truth is the solution will have to derive from the tables above.
Those tables show where the nuclide bases are used, so the usage of `x.nuclideBases` will determine where they have to live. Where can we put this new object so that it causes the fewest changes in ARMI and downstream?
It is probable that no matter where I put the new object, there will have to be some breaking changes (probably in the materials). BUT choosing a good place to put this new object will cut the work needed for this change down by 2x or even 10x.
Two cents: R.nuclides over App.nuclides
Rationale: interfaces always have access to the reactor, not necessarily the application. And the app is already a global thing, so putting the nuclides there isn't really getting rid of global nuclides. It's just moving them.
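To show why `r.nuclideBases` is more reachable than anything on the App: every ARMI interface already holds a reactor reference, so the proposed attribute would be available wherever interface code runs. A sketch under that assumption; `Reactor.nuclideBases` does not exist yet.

```python
from armi import interfaces

class ExampleInterface(interfaces.Interface):
    """Hypothetical interface showing where the proposed data would be read from."""

    name = "exampleInterface"

    def interactEveryNode(self, cycle, node):
        # self.r is the reactor that every interface already has access to;
        # nuclideBases here is the proposed (not yet existing) attribute.
        u235 = self.r.nuclideBases.byName["U235"]
```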
Okay, this ticket is important, but has hung around too long. I keep being told it is too big for the current release, and the like.
So, to mitigate those concerns, I will be breaking this feature change into three PRs. The first two PRs will be non-API-breaking and entirely internal to ARMI. This way the work will not affect downstream users for as long as possible and, if necessary, will allow me to spread the work over multiple releases.
- Make the classes `NuclideBases` and `Elements` and use them to encapsulate all the global dictionaries in `nuclideBases.py` and `elements.py`. Put one instance of each of these in the global scope to corral the data, and keep the current API endpoints, but point to the new classes under the hood. (Backwards compatible.)
- Add a reference to the global nuclide data in `Reactor`, and let every `Composite` directly reference it as well. Rewrite any code we can, without breaking the API or causing downstream changes, using the new `Reactor.nuclideBases`. (Backwards compatible; see the sketch after this list.)
- Ditch global nuclides. Finally remove all global nuclides and exclusively use `Reactor.nuclideBases`. (API breaking.)
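A minimal sketch of the second step in that list: the `Reactor` holds the reference, and every `Composite` reaches it through its parent, so code can start writing `composite.nuclideBases` before the globals are actually removed. The wiring (parent walk, property names) is illustrative only.

```python
class Composite:
    """Simplified stand-in for ARMI's Composite tree node."""

    def __init__(self, parent=None):
        self.parent = parent

    @property
    def nuclideBases(self):
        # Walk up to the Reactor, which owns the authoritative reference.
        return self.parent.nuclideBases if self.parent is not None else None


class Reactor(Composite):
    """Simplified stand-in for ARMI's Reactor, holding the nuclide data."""

    def __init__(self, nuclideBases):
        super().__init__(parent=None)
        self._nuclideBases = nuclideBases

    @property
    def nuclideBases(self):
        return self._nuclideBases
```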
It may seem like overkill to break this ticket into three PRs, but at this point I believe it is necessary.
I love this plan!