Account for baseline calculations for Saving Sessions
Octopus Saving Sessions use your baseline usage over the previous 10 equivalent slots to work out the energy you "saved" over the session. However, Predbat just credits the slot with the full export amount.
E.g. over the past 10 days I have exported at the full 6kW my DNO allows, but only on 5 of those days. Therefore my baseline "import" is negative 3kWh. If I then export at the full 6kW again during the session, I will only receive the bonus for the extra 3kWh above my baseline. Predbat adds the bonus for the full 6kWh, so the predictions are not correct, and neither is the total cost value.
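To illustrate the gap (a hypothetical sketch, not Predbat's actual code; the function name and the 10-day history values are my own for the example): the reward should only be paid on export above the baseline, whereas Predbat currently credits the full export.

```python
# Hypothetical sketch of a baseline-adjusted Saving Session reward,
# illustrating the issue described above (not actual Predbat code).

def session_reward_kwh(export_kwh: float, baseline_kwh: float) -> float:
    """Reward is paid only on energy delivered beyond the baseline.

    baseline_kwh is the average net export over the previous 10
    equivalent slots; a negative value would mean you normally import.
    """
    return max(0.0, export_kwh - baseline_kwh)

# The example from the post: full 6kWh export on 5 of the last 10 days,
# zero on the other 5, so the baseline averages out to 3kWh of export.
history = [6, 0, 6, 0, 6, 0, 6, 0, 6, 0]   # kWh per slot, last 10 days
baseline = sum(history) / len(history)      # 3.0 kWh

print(session_reward_kwh(6.0, baseline))    # rewarded energy: 3.0 kWh
print(6.0)                                  # what Predbat currently credits
```

The difference between the two printed figures is exactly the over-prediction described above.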
The Octopus Plugin does have a baseline entity, but it is no use for prediction, because it only shows a single value for the first half-hour slot and then updates for the second half-hour slot. So you would need to calculate the baseline figure manually from the historic data in HA, or from the API data. Maybe contact this developer, as they already have the calculations worked out from the API:
https://savingsessions.streamlit.app/
Agreed, at the moment Predbat doesn't take account of the baseline import/export figure in working out the projected savings.
Is it worthwhile making this change?
In my view, calculating a more accurate figure would be 'better', but since the DFS savings figures have been so low (12/13p), it's a bit of a moot improvement, as participating in DFS (this year) brings in little real financial benefit. I've personally not exported anything in the last few sessions, as it would mean emptying the batteries earlier and then having to import later at a higher rate. Your example of a 6kW export vs a 3kW baseline is a DFS difference of 78p vs 39p.
We are also coming to the end of saving sessions for this year. There might be one or two more, but probably not many more. And whether they run next year, we will have to see.
Also, a practical aspect of doing a baseline calculation from HA data: you'd need to extend the number of days of history retained. The HA default is 10 days, but DFS is calculated over the last 10 weekdays / 4 weekend days. I purge a lot of my entity history with shorter durations than this to shrink the HA database size. Ideally you would use Long Term Statistics in HA, but since these are aggregated to hourly figures they aren't suitable for calculating the half-hourly baseline. You could get the data from Octopus if Predbat knew your Octopus API key, but that adds loads more complexity.
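For what it's worth, here's a rough sketch of how the half-hourly baseline lookup could work once you have the data (hypothetical code; the `history` dict shape and function name are my own, and actually pulling the half-hourly values out of HA or the Octopus API is out of scope):

```python
# Hypothetical sketch of a DFS-style baseline from half-hourly data:
# last 10 weekdays for a weekday session, last 4 weekend days for a
# weekend session. `history` maps date -> list of 48 half-hourly
# net-export kWh values (one per slot of the day).
from datetime import date, timedelta

def baseline_for_slot(history: dict, session_day: date, slot: int,
                      weekday_days: int = 10, weekend_days: int = 4) -> float:
    """Average net export in the given half-hour slot over the relevant
    preceding days (weekday sessions use weekdays, weekend use weekends)."""
    want_weekend = session_day.weekday() >= 5
    needed = weekend_days if want_weekend else weekday_days
    samples = []
    day = session_day - timedelta(days=1)
    # Walk backwards, skipping the wrong day type; give up after 60 days.
    while len(samples) < needed and (session_day - day).days <= 60:
        if (day.weekday() >= 5) == want_weekend and day in history:
            samples.append(history[day][slot])
        day -= timedelta(days=1)
    return sum(samples) / len(samples) if samples else 0.0
```

Note the real Octopus rules also exclude previous session days from the baseline (as mentioned below, "10 non-saving-session days"), which this sketch doesn't attempt.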
I've built something to do this (a similar method to the Solcast adaptive damping, @gcoan). I keep track of export during the peak hours over the previous 10 non-saving-session days (or 4 weekend days), then work out an adjusted "octopoints per penny" figure for each half-hour slot. It's still a manual operation to work out an appropriate single rate to use for the whole saving session, then amend apps.yaml to change the "octopoints per penny" value.
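The rate adjustment itself is simple arithmetic. A hypothetical sketch (my own function name and example numbers, not the actual implementation): since only export above the baseline is rewarded, you can scale the nominal rate by the rewarded fraction so that Predbat's full-export credit comes out roughly right.

```python
# Hypothetical sketch of scaling "octopoints per penny" so Predbat's
# full-export credit approximates the baseline-adjusted reward.

def adjusted_octopoints_per_penny(nominal_rate: float,
                                  expected_export_kwh: float,
                                  baseline_export_kwh: float) -> float:
    """Scale the nominal rate by the fraction of the expected export
    that is actually rewarded (export above baseline)."""
    if expected_export_kwh <= 0:
        return 0.0
    rewarded = max(0.0, expected_export_kwh - baseline_export_kwh)
    return nominal_rate * rewarded / expected_export_kwh

# Using the earlier example: 6kWh export vs a 3kWh baseline means only
# half the export is rewarded, so the rate is halved (8 -> 4 here).
print(adjusted_octopoints_per_penny(8.0, 6.0, 3.0))  # -> 4.0
```

The single-rate compromise mentioned above corresponds to picking one baseline figure for the whole session rather than per half-hour slot.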
I built this last year when rewards were better and I was on Flux, so my export coincided with the typical saving session times. I'm now on IOG, so I don't export much during the saving session times, and with the smaller rewards I'm not sure the extra (attempted) accuracy makes that much difference.
Hey @gcoan, the good news is that Saving Sessions (or rather the underlying DFS) are year-round now! Of course we would expect the grid conditions that make a 'Saving Session' necessary to be more likely in winter, but I'll be opting in for each of them whenever they pop up. I think it's been brilliant this winter to have PredBat up and running to decide if, and how much, I export for them. New look Demand Flexibility Service to go live next week