
Zeroth order interpolation

Open aabills opened this issue 2 years ago • 7 comments

Description

This feature would add an option to pybamm.Interpolant that would allow the interpolant to hold the previous value until the next knot is reached. For example, if the interpolant f has the following knots and values

knots = [0, 1, 2]
values = [10, 0, 0]

then f(0.5) = 10 and f(1.5) = 0.

Motivation

It seems natural to me for some drive cycles to use a zero-order hold interpolation. In other words, rather than linearly interpolating the current/power/etc., hold at the previous value until the next time point is reached. This is implicitly how string experiments work: ["Discharge at 3A for 25 seconds", "Rest for 25 seconds"] holds at 3A until 25 seconds rather than linearly changing the current. However, since pybamm.Interpolant only supports linear, cubic, and pchip interpolation, this is a little cumbersome right now.

Possible Implementation

It would be pretty straightforward to add a zero-order hold option to the interpolant (scipy.interpolate already supports it). I think only 1D is necessary; I can't think of a case where one would want this in multiple dimensions.
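For reference, SciPy exposes this via `interp1d` with `kind="previous"` (hold the last knot's value until the next one); a minimal sketch using the knots from this issue:

```python
from scipy.interpolate import interp1d

# Zero-order hold: hold the previous value until the next knot is reached
f = interp1d([0, 1, 2], [10, 0, 0], kind="previous")

print(f(0.5))  # 10.0
print(f(1.5))  # 0.0
```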

Additional context

If there's already an easy way of doing this, let me know; otherwise I'm going to implement it.

aabills avatar Oct 12 '23 18:10 aabills

Sounds reasonable to include this. The challenge you might face is with the CasADi solver: is there a zeroth-order interpolation in CasADi?

valentinsulzer avatar Oct 12 '23 22:10 valentinsulzer

It doesn't; I guess we'll have to get a bit creative.

aabills avatar Oct 12 '23 22:10 aabills

You could do a bunch of heavisides (though the expression tree will get huge). Or pre-process the data with a zeroth-order interpolator onto a very fine grid with constant steps, then use a linear interpolator.

valentinsulzer avatar Oct 12 '23 23:10 valentinsulzer

Yeah, heavisides was my first thought, but there are a couple of problems with that: one is what you said (my guess is we'll get overflow errors even with reasonably sized drive cycles), and it's also going to be O(N) rather than the optimal O(log N). I'll come back to this and let you know if I come up with something reasonable.
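For what it's worth, the O(log N) lookup itself is just a binary search over the knots. A plain-Python sketch (`zoh_lookup` is a hypothetical helper, not a pybamm expression, so it wouldn't run inside the CasADi solver, but it illustrates the target complexity):

```python
import bisect

def zoh_lookup(knots, values, t):
    # Rightmost knot <= t, found in O(log N) by binary search;
    # clamp to the first value for t before the first knot.
    i = bisect.bisect_right(knots, t) - 1
    return values[max(i, 0)]

knots = [0, 1, 2]
values = [10, 0, 0]
print(zoh_lookup(knots, values, 0.5))  # 10
print(zoh_lookup(knots, values, 1.5))  # 0
```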

aabills avatar Oct 12 '23 23:10 aabills

My recommendation for this with existing pybamm functionality would be to pre-process the drive cycle data by adding more points to simulate a zeroth-order interpolant, e.g. add a point just before each step with the same value as the previous step.
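That pre-processing could look something like this (`densify_for_zoh` is a hypothetical helper name; `eps` controls how sharp the resulting step is):

```python
import numpy as np

def densify_for_zoh(knots, values, eps=1e-9):
    # Insert a point just before each knot carrying the previous value,
    # so linear interpolation between the new points approximates a step.
    ts, vs = [knots[0]], [values[0]]
    for k, v in zip(knots[1:], values[1:]):
        ts.extend([k - eps, k])
        vs.extend([vs[-1], v])
    return np.array(ts), np.array(vs)

ts, vs = densify_for_zoh([0, 1, 2], [10, 0, 0])
print(np.interp(0.5, ts, vs))  # 10.0
print(np.interp(1.5, ts, vs))  # 0.0
```

The densified arrays can then be fed to the existing linear interpolant.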

valentinsulzer avatar Mar 30 '24 16:03 valentinsulzer

Quick and dirty heaviside version:

import pybamm

def zoh(ts, vals, t):
    # Zero-order hold built from pybamm heavisides, splitting the knots
    # in half at each level so the expression tree depth is O(log N).
    if len(ts) > 2:
        mid = len(ts) // 2
        switch = t < pybamm.Scalar(ts[mid])
        return switch * zoh(ts[:mid], vals[:mid], t) + (1 - switch) * zoh(ts[mid:], vals[mid:], t)
    elif len(ts) == 2:
        # Hold vals[0] on [ts[0], ts[1]), then switch to vals[1]
        switch = t < pybamm.Scalar(ts[1])
        return vals[0] * switch + vals[1] * (1 - switch)
    elif len(ts) == 1:
        return vals[0]
    else:
        raise ValueError("ts must be non-empty")

aabills avatar Jun 10 '24 19:06 aabills