SpineOpt.jl
WIP: Merge economic representation of investments
With this merge request, we will allow for multi-period investments in SpineOpt while taking into account the time value of money: all costs are discounted accordingly. The user is also now able to define lead times, technical and economic lifetimes, and decommissioning times for units, nodes, and connections. Units and nodes can also be (de-)mothballed or retired early. Finally, we differentiate between `investment_variables` and `investment_variables_vintage`, which will also allow us to differentiate technology characteristics by vintage year and to include degradation in the long term.
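To make the discounting concrete, here is a minimal, self-contained sketch (not SpineOpt code; all names and numbers are made up for illustration) of how an overnight investment cost could be converted into annuities over an economic lifetime and discounted back to a reference year:

```julia
# Minimal illustration of discounting investment costs; names are hypothetical,
# not SpineOpt's API.

"Present-value factor for a cash flow `t` years after the discount year."
discount_factor(rate, t) = 1 / (1 + rate)^t

"""
Convert an overnight investment cost into equal annuities over the economic
lifetime, each discounted back to the discount year. `vintage_year` is the
year the investment becomes available.
"""
function discounted_annuities(overnight_cost, rate, lifetime, vintage_year, discount_year)
    # Capital recovery factor: spreads the overnight cost into equal annual payments.
    crf = rate * (1 + rate)^lifetime / ((1 + rate)^lifetime - 1)
    annuity = overnight_cost * crf
    # Discount each annual payment back to the discount year and sum.
    sum(annuity * discount_factor(rate, (vintage_year + k) - discount_year) for k in 0:(lifetime - 1))
end

# Example: 1000 €/MW invested in 2030, 5 % discount rate, 20-year economic lifetime,
# discounted back to 2025.
discounted_annuities(1000.0, 0.05, 20, 2030, 2025)
```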
Before the merge can be executed, the following elements still need to be addressed:
- [ ] Restore functionality
- [ ] Fix Benders decomposition accordingly
- [ ] Add all investment variables to the spineopt_template
- [ ] Double check variable creation
- [ ] Document new functionality
- [ ] Document new economic parameters (technology discount rate, discount rate, discount duration, lead time, mothballing costs, capacity transfer factor, conversion to discounted annuities, etc.)
- [ ] Document new investment variables (investment state, investment available, early decommissioned, decommissioned, de-mothballed, ...)
- [ ] Document new investment constraints (investment state, investment available, early decommissioned, decommissioned, de-mothballed, ...)
- [ ] Write tests for all new functionality
- [ ] For economic parameters
- [ ] For new constraints
- [ ] Tests should also cover, e.g., stochastic-scenario-dependent lead times and rolling horizon
- [ ] Double check scenario implementation for parameters -> possibly add model dimension
- [ ] Write migration script
- [ ] New parameters, but also renamings, e.g. `unit_lifetime` -> `unit_economic/technical_lifetime`
- [ ] Define reasonable defaults
- [ ] By default, the user should not need to define anything, e.g. to run an operational model
- [ ] Make sure that economic structure doesn't need to be generated if problem is operational only
- [ ] Should `univested__invested_available` be fixed to 0 at the start, or rather `units_invested` if not defined? (one follows the other). Also, would `units_invested_available`, if defined, fail without a corresponding investment variable?
- [ ] Data structure check (see the validation sketch after this checklist)
- [ ] Economic lifetime should be smaller than or equal to the technical lifetime
- [ ] If only one lifetime is given, assume both are equal and issue a warning
- [ ] Warn if only the decommissioning time is given but not the decommissioning costs
- [ ] Error if the discount rate is larger than 1
- [ ] If `lead_time` is `nothing`, it should default to `0 Years`
- [ ] If we want to disallow Benders + Milestone for the moment -> throw error accordingly
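As a rough illustration of the data structure checks listed above, here is a minimal validation sketch. The function and parameter names are placeholders for illustration only and do not correspond to SpineOpt's actual check functions:

```julia
# Hypothetical validation sketch; names are placeholders, not SpineOpt's API.
using Dates: Year

function check_economic_data(u; economic_lifetime, technical_lifetime, discount_rate, lead_time)
    # Economic lifetime must not exceed the technical lifetime.
    if economic_lifetime !== nothing && technical_lifetime !== nothing
        economic_lifetime <= technical_lifetime || error(
            "economic lifetime of $u exceeds its technical lifetime"
        )
    elseif economic_lifetime === nothing && technical_lifetime !== nothing
        @warn "only the technical lifetime is given for $u; assuming the economic lifetime is equal"
        economic_lifetime = technical_lifetime
    end
    # The discount rate is a fraction, not a percentage.
    discount_rate <= 1 || error("discount rate of $u is larger than 1; give it as a fraction")
    # A missing lead time defaults to zero years.
    lead_time === nothing && (lead_time = Year(0))
    (; economic_lifetime, technical_lifetime, discount_rate, lead_time)
end

# Example usage
check_economic_data(:wind_farm;
    economic_lifetime=Year(20), technical_lifetime=Year(25),
    discount_rate=0.05, lead_time=nothing)
```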
This one will first require an in-depth discussion on what the best/desired way is to incorporate vintage and economic discounting into SpineOpt in the future. Then we can see whether there is still value in moving forward with this branch. Related issues are #325, #326 and #601. I believe we also had a discussion on this at some point, but I can't find it anymore.
#476 is also related to this
The main takeaways from the hackathon in WP4 (MOPO project):
- The stochastic scenario vintage in the current branch does not support the `lifetime` changing among scenarios (maybe the WACC, too). Add a caveat in the documentation.
- The current implementation supports a stochastic structure for the discount rate.
- (fix) `unit_discounted_durantion` -> It doesn't work for the units being invested in (but it works for those that are not).
- Another caveat -> it is not supported when using Benders decomposition (the combination of milestone years + Benders decomposition is not supported). If no milestone years are selected, the decomposition can be used.
- We need to double-check the variable creation for the vintage variables according to the new structure.
- Keyword in the code => `history_long` (good for speeding up the code). This parameter will help differentiate between variables that need history in the constraints and those that don't. Double-check how much history the lifetime parameter triggers; this problem has a separate issue, see #606. A rough sketch of the idea follows after this list.
- The final version should allow the user to choose between two options: the existing one and the milestone-year option (refactor the code to support that).
- Include tests (must), documentation, a migration script (must), and a tutorial.
- Rolling horizon is not supported (future improvement).
- Profile both branches with the exact same case study (double-check the existing tutorials).
- What if the investment happens at half of the year? (maybe it is not supported)
- More details in #476.
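To sketch the intent behind the `history_long` flag mentioned above: SpineOpt's actual variable machinery differs, and the names below are invented purely to convey the idea of generating long (lifetime-sized) history only for the variables that need it:

```julia
# Hypothetical illustration only; not SpineOpt's actual variable definitions.
struct VariableSpec
    name::Symbol
    history_long::Bool  # true if constraints on this variable look far back in time (e.g. lifetimes)
end

specs = [
    VariableSpec(:units_invested_available, true),  # lifetime constraints need long history
    VariableSpec(:unit_flow, false),                # operational constraints only need short history
]

# Generate the long history time slices only for variables that need them, which
# avoids creating unnecessary history variables and speeds up model generation.
needs_long_history = [s.name for s in specs if s.history_long]
```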
Hi @mihlema, we investigated one problem that was left from the hackathon in Leuven.
- (fix) `unit_discounted_durantion` -> It doesn't work for the units being invested in (but it works for those that are not).
We found the cause of this error, and proposed a change to resolve it. But we don't know yet if the change is correct or not, because it is more related to the intention of your initial code. Maybe you can see more easily what should be changed from there.
Here's a screenshot of the story:
The yellow boxes show the cause of the initial error and our proposed changes. The problem occurs because the wind farm has only one stochastic scenario, so the function `_find_children` returns an empty collection. We propose to first check whether it is empty; if it is, use the root scenario instead.
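Roughly, the fallback we have in mind looks like the sketch below. This is illustrative only: `find_children` and `root_scenario` stand in for whatever the real code uses at that point, so it is not a drop-in patch:

```julia
# Illustrative sketch of the proposed fallback, not the actual SpineOpt code.
function child_or_root_scenarios(scenario, find_children, root_scenario)
    children = find_children(scenario)
    # If the scenario has no children (e.g. the wind farm, which has a single
    # stochastic scenario), fall back to the root scenario instead of an empty result.
    isempty(children) ? [root_scenario] : children
end

# Toy example: a scenario with no children falls back to the root.
child_or_root_scenarios(:wind_farm_scen, s -> Symbol[], :root)  # -> [:root]
```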
The orange line and arrow show a new error: when we take the root scenario, something goes wrong with `t.block`.
Since we are not fully aware of the intention of this stochastic part, could you have a look and tell us what to do?
Thank you!
List of existing tests (checked if passing):
- [x] data_structure/migration.jl
- [x] data_structure/check_data_structure.jl
- [x] data_structure/preprocess_data_structure.jl
- [ ] data_structure/temporal_structure.jl: it fails due to the long/short history
- [x] data_structure/stochastic_structure.jl
- [ ] data_structure/algorithm_mga_structure.jl: all tests failing
- [x] data_structure/postprocess_results.jl
- [ ] constraints/constraint_unit.jl: 19 tests failing
- [ ] constraints/constraint_node.jl: 11 tests failing
- [ ] constraints/constraint_connection.jl: 22 tests failing
- [ ] constraints/constraint_user_constraint.jl: 6 tests failing
- [ ] constraints/constraint_investment_group.jl: all tests failing
- [ ] objective/objective.jl: 2 tests failing
- [x] util/misc.jl
- [x] run_spineopt.jl
- [ ] run_spineopt_benders.jl: all tests failing
- [x] run_examples.jl
List of new tests in data_structure/check_economic_structure.jl (checked if passing)
Already added
- [x] "test discounted duration - using milestone years, w/o inv. blocks"
- [x] "test discounted duration - w/o using milestone years"
- [x] "test investment costs, salvage fraction, decommissioning"
- [x] "test investment cost scaling"
- [x] "test technological discount factor"
To be added
- [ ] "test capacity transfer factor"
Add new tests for the following:
objective
- [ ] objective/unit_fixed_om_costs.jl
- [ ] objective/storage_fixed_om_costs.jl
- [ ] objective/connection_fixed_om_costs.jl
- [ ] objective/unit_decommissioning_costs.jl
- [ ] objective/unit_mothballing_costs.jl
- [ ] objective/connection_decommissioning_costs.jl
- [ ] objective/storage_decommissioning_costs.jl
- [ ] objective/storage_mothballing_costs.jl
unit constraints
- [ ] constraints/constraint_units_decommissioned_vintage.jl
- [ ] constraints/constraint_units_decommissioned.jl
- [ ] constraints/constraint_units_invested_available_bound.jl
- [ ] constraints/constraint_units_invested_available_vintage.jl
- [ ] constraints/constraint_units_invested_available.jl
- [ ] constraints/constraint_units_invested_state_vintage.jl
- [ ] constraints/constraint_units_invested_state.jl
- [ ] constraints/constraint_units_mothballed_state_vintage.jl
connections constraints
- [ ] constraints/constraint_connections_decommissioned_vintage.jl
- [ ] constraints/constraint_connections_decommissioned.jl
- [ ] constraints/constraint_connections_invested_available_bound.jl
- [ ] constraints/constraint_connections_invested_available_vintage.jl
- [ ] constraints/constraint_connections_invested_available.jl
storage constraints
- [ ] constraints/constraint_storages_decommissioned_vintage.jl
- [ ] constraints/constraint_storages_decommissioned.jl
- [ ] constraints/constraint_storages_invested_available_bound.jl
- [ ] constraints/constraint_storages_invested_available_vintage.jl
- [ ] constraints/constraint_storages_invested_available.jl
- [ ] constraints/constraint_storages_invested_state_vintage.jl
- [ ] constraints/constraint_storages_invested_state.jl
- [ ] constraints/constraint_storages_mothballed_state_vintage.jl
Very nice job so far! I added a few comments here and there.
So the change is, to say the least, substantial. How do we gain enough confidence to merge this into master? It would be nice to see a more detailed discussion of the code design, to help one understand the different decisions. The file economic_structure.jl seems very complicated too.
Any chance we can split this into smaller, more manageable PRs? At the moment it feels very overwhelming.
What do you think @datejada @g-moralesespana @jkiviluo @gnawin ?
Hi @manuelma, thanks for the comments; we really need a deep review and understanding of the changes. So, we need more discussion about the changes and refactoring to improve readability in the new module. We will gain confidence in the proposed changes if we fix the tests. That's why I advocate to work on that part first 😉
Sounds good @datejada - one thing though is, most of the tests validate that the model generates a certain constraint expression. If you change the constraint in the source, you also need to change it in the tests. However, this doesn't test that the results of the model are correct or, more accurately, that the results for a certain dataset are preserved after a change in the constraint source code. This is a flaw of our test suite.
So what we'd need to do in my opinion is expand our test suite with tests that check the results for a good number of representative investment models under the current design. If these tests aren't broken after changing the design, I would feel much more comfortable. But then again that can be a lot of work. I don't know how much time you can allocate to this...
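For illustration, such a result-based test could look roughly like the sketch below. The dataset path and the reference objective value are placeholders, and the exact `run_spineopt` keyword arguments may differ; only the shape of the check is meant here:

```julia
using Test
using JuMP: objective_value, termination_status, MOI
using SpineOpt: run_spineopt

# Placeholder dataset; in practice this would be one of the representative
# investment models, stored alongside the tests.
url_in = "sqlite:///$(@__DIR__)/representative_investment_model.sqlite"

@testset "representative investment model results" begin
    m = run_spineopt(url_in; log_level=0)
    @test termination_status(m) == MOI.OPTIMAL
    # Reference value recorded once under the current design (placeholder number);
    # a redesign of the constraints must reproduce it.
    @test objective_value(m) ≈ 1.234e6 rtol=1e-6
end
```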
Another thought is, this is your first PR and it's hundreds if not thousands of lines of code. Wouldn't you feel more comfortable starting with something more bounded? One suggestion might be to try and fix that issue with the history (I can give you some pointers, maybe we can even have a chat) so that you gain more confidence and get to know some of the internal mechanics much better. That's just an idea of course.
Hi @manuelma, @gnawin and I were discussing and pondering the options, and we agree that we should split the changes into more controlled PRs. So, we have created an Epic issue #908 with a guideline of the changes (based on all the information in this PR), and from there we can create smaller issues to merge one by one. The strategy remains the same: reuse all the work from @mihlema, but land the changes through smaller PRs.
Closed in favor of #929