Add demand elasticity to speed up optimization
## Context
Formulating the PyPSA optimization as a QP problem and adding elasticity appears to significantly reduce the solving time. This has been observed by @fneum with a PyPSA electricity demo model. The exact reason for the performance improvement is still under discussion. Nevertheless, testing the new formulation within PyPSA-Eur seems worthwhile.
## Idea
Let's incorporate the QP formulation and demand elasticity for the electricity sector, using typical realistic values. For other sectors, assuming a realistic demand elasticity value will be challenging. One possible approach to also include other sectors is to add a very steep gradient, which can enhance solver speed without altering the results' assumptions.
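For orientation, here is one way the elastic formulation can be sketched (my notation, not taken from PyPSA; $p_0$ is the willingness to pay and $D_t$ the reference demand): shedding $s_t$ of the demand is penalised quadratically, so that the marginal cost of shedding rises linearly from zero to $p_0$ at full curtailment.

```math
\min_{g,\,s}\ \sum_t \left( c^\top g_t + \frac{p_0}{2 D_t}\, s_t^2 \right)
\quad \text{s.t.} \quad \mathbf{1}^\top g_t + s_t = D_t,\ \ 0 \le s_t \le D_t
```

The derivative of the quadratic term is $p_0 s_t / D_t \in [0, p_0]$, which is the linear inverse demand curve; this is what turns the LP into a QP. A very steep gradient corresponds to a large $p_0$, i.e. nearly inelastic demand.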
If someone is interested in making a paper out of this, ping us on the PyPSA meets Earth Discord.
Out of curiosity I tried this out kind of quickly, but so far I'm unfortunately not having much success... Here's the essence of my implementation:
```python
# load_shedding["willingness_to_pay"] is loaded from config, in EUR/kWh
load_by_bus = n.loads_t.p_set.T.groupby(n.loads.bus).sum().T
AC_buses = n.buses[n.buses.carrier.isin(["AC", "low voltage"])].index.intersection(
    load_by_bus.columns
)
load_by_bus = load_by_bus.loc[:, AC_buses]
load_shedding_i = load_by_bus.columns
n.madd(
    "Generator",
    load_shedding_i,
    suffix=" load shedding",
    bus=load_shedding_i,
    carrier="load",
    # Here, we convert back to EUR/MWh
    marginal_cost_quadratic=(1e3 * load_shedding["willingness_to_pay"]) / (2 * load_by_bus),
    marginal_cost=0,
    p_nom=load_by_bus.max(),
)
```
This is added in solve_network.py as an alternative to the "regular" load shedding. I added the "willingness_to_pay" coefficient (should probably be named something better) to the config, and a wildcard option for it in order to compare results easily.
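As a sanity check on the quadratic coefficient above (toy numbers of my own, not from the model): with `a = WTP / (2 * q_max)`, the marginal cost of shedding, `2 * a * q`, should rise linearly and hit the willingness to pay (converted to EUR/MWh) exactly at full shedding:

```python
# Toy numbers (assumptions): 100 MWh load at one bus, WTP of 50 EUR/kWh.
q_max = 100.0                               # MWh, load at this bus in one snapshot
wtp_eur_per_kwh = 50.0                      # from config
a = (1e3 * wtp_eur_per_kwh) / (2 * q_max)   # quadratic coefficient, EUR/MWh^2

def marginal_cost(q):
    """Derivative of the quadratic objective term a*q^2, in EUR/MWh."""
    return 2 * a * q

print(marginal_cost(0.0))     # → 0.0: the first kWh shed is free
print(marginal_cost(q_max))   # → 50000.0, i.e. the WTP converted to EUR/MWh
```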
It sort of seems to work, but it's entirely possible that I made a mistake, so please correct me if I'm wrong!
The following results are from a very "default" pypsa-eur network, 40 clusters, 50seg time resolution, single planning horizon at 2040.
As expected, total system costs are lower with more elastic demand:
Here and below, "elastic20" (for example) means the "willingness_to_pay" config option was set to 20. The higher the number, the less elastic the demand.
However, marginal prices are not all that different:
The real bummer is that solving times are worse, not better:
In the above, I applied demand elasticity only for AC/low voltage buses. I also tried adding demand elasticity at all buses that have some load, but this made solving times significantly worse.
First of all, we can hope that I made some mistake in my implementation, and that a correct implementation would actually bear fruit. If not, there are a few big differences between this example and the setup in the price formation paper:
- One node / 40 nodes
- Constant demand / varying demand
- Electricity only / sector coupling
- High temporal resolution / low temporal resolution

The best hope for still finding solving-time improvements might be that these benefits only materialise at relatively high temporal resolution? I'm just checking with some higher-resolution models and will update this once the results are in.
Has anyone else tried this out and gotten similar results? Something totally different?
So far I haven't been able to find any performance gains for larger models. In fact, for a 40-node, 1000seg Europe-wide model I got the following:
```
Optimize a model with 8396199 rows, 3958969 columns and 20102628 nonzeros
Model fingerprint: 0xd959d976
Model has 40000 quadratic objective terms
Coefficient statistics:
  Matrix range     [2e-03, 1e+03]
  Objective range  [2e-02, 2e+06]
  QObjective range [3e+00, 8e+03]
  Bounds range     [9e-02, 4e+10]
  RHS range        [2e-01, 9e+08]
Warning: Model contains large bounds
         Consider reformulating model or setting NumericFocus parameter
         to avoid numerical issues.
Presolve removed 4334219 rows and 330000 columns (presolve time = 5s) ...
Presolve removed 4798421 rows and 794202 columns (presolve time = 10s) ...
Presolve removed 4798421 rows and 822667 columns
Continuous model is non-convex -- solving as a MIP
Presolve removed 4396272 rows and 393057 columns (presolve time = 5s) ...
Presolve removed 4396272 rows and 393057 columns (presolve time = 10s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 15s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 20s) ...
Presolve removed 4793219 rows and 790004 columns (presolve time = 25s) ...
Presolve removed 4793219 rows and 796331 columns (presolve time = 31s) ...
Presolve removed 4802407 rows and 805519 columns (presolve time = 35s) ...
Presolve removed 4805102 rows and 808661 columns (presolve time = 40s) ...
Presolve removed 4805102 rows and 808661 columns (presolve time = 45s) ...
Presolve removed 4807102 rows and 809662 columns (presolve time = 50s) ...
Presolve removed 4807102 rows and 809662 columns
Presolve time: 50.50s
Presolved: 3589099 rows, 3189308 columns, 13560273 nonzeros
Presolved model has 39999 quadratic constraint(s)
Presolved model has 1 bilinear constraint(s)
Variable types: 3189308 continuous, 0 integer (0 binary)
Root relaxation presolved: 3545114 rows, 3105650 columns, 13388930 nonzeros
```
Gurobi wasn't able to solve this model. ("willingness_to_pay" set to 50 EUR/kWh.)
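One thing that might be worth checking (this is my speculation, not a confirmed diagnosis): the `Continuous model is non-convex -- solving as a MIP` line means Gurobi found quadratic objective terms that are not positive semidefinite. Since the coefficient is `WTP / (2 * load)`, any bus/snapshot with zero (or negative) load yields an infinite or negative coefficient. A minimal sanity check with hypothetical toy data:

```python
# Hypothetical per-snapshot loads at one bus (MWh); one entry is zero.
loads = [120.0, 95.0, 0.0, 110.0]
wtp_eur_per_kwh = 50.0  # from config

# Quadratic coefficient per snapshot, as in the snippet above (EUR/MWh^2).
coeffs = [(1e3 * wtp_eur_per_kwh) / (2 * q) if q > 0 else float("inf") for q in loads]

# For a convex QP, every coefficient must be finite and strictly positive.
bad = [c for c in coeffs if not (0 < c < float("inf"))]
print(bad)  # → [inf]: the zero-load snapshot would break convexity
```

If that turns out to be the problem, dropping or clipping zero-load snapshots before building the coefficients would be the obvious fix.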