
Reproducing the "discharge" and "full" models

fingertap opened this issue 2 years ago • 3 comments

Hi, thanks for this very good and organized codebase.

Following your code, the "variance" model is easy to reproduce. However, I have trouble reproducing the performance of the "discharge" and "full" models. Do you know what detail I may be missing?

My results (RMSE on the train / test1 / test2 sets) are:

(103.5718828346258, 138.2982760624032, 195.8641688228285)    # var
(62.6994702955215, 181.47910490241762, 194.8456325157642)    # discharge
(62.61943481775296, 141.89261700359316, 170.8711847567794)    # full

Here is my code for feature generation:

import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.linear_model import LinearRegression
from sklearn import preprocessing

def var_feature(batch, to_skip=None):
    # "Variance" model: log10 of the variance of the Delta Q(V) curve
    # (cycle key '99' minus cycle key '9' in this data's indexing).
    feats = []
    to_skip = to_skip or []
    if isinstance(to_skip, int):
        to_skip = [to_skip]
    for indx, cell in enumerate(batch):
        if indx in to_skip:
            continue
        x = cell['cycles']['99']['Qdlin'] - cell['cycles']['9']['Qdlin']
        feats.append([np.log10(np.var(x))])
    return np.array(feats)
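
For context, here is a minimal sketch of how these feature builders (this one and the two below) are turned into the matrices that go into train() further down. The build_dataset helper, the batch1/batch2/batch3 names, and the cycle_life key are placeholders for my setup and may not match the exact split used in the paper:

def build_dataset(batches, feature_fn, to_skips=None):
    # Stack features and cycle-life targets across several batches,
    # skipping the same cell indices that the feature builder skips.
    to_skips = to_skips or [None] * len(batches)
    xs, ys = [], []
    for batch, to_skip in zip(batches, to_skips):
        xs.append(feature_fn(batch, to_skip=to_skip))
        skip = [to_skip] if isinstance(to_skip, int) else (to_skip or [])
        ys.append(np.array([float(np.squeeze(cell['cycle_life']))
                            for i, cell in enumerate(batch) if i not in skip]))
    return np.concatenate(xs), np.concatenate(ys)

# Hypothetical split: train on two batches, test on a third
# (the secondary test set is built the same way).
x_train, y_train = build_dataset([batch1, batch2], var_feature)
x_test1, y_test1 = build_dataset([batch3], var_feature)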

def discharge_feature(batch, to_skip=None, freq=10):
    # "Discharge" model: log10 of {variance, |min|, |skew|, |kurtosis|} of the
    # (subsampled) Delta Q(V) curve, the early-cycle discharge capacity, and
    # the difference between the maximum and the early-cycle capacity.
    feats = []
    to_skip = to_skip or []
    if isinstance(to_skip, int):
        to_skip = [to_skip]
    for indx, cell in enumerate(batch):
        if indx in to_skip:
            continue
        cell_feat = []
        x = cell['cycles']['99']['Qdlin'] - cell['cycles']['9']['Qdlin']
        cell_feat.append(np.var(x[::freq]))
        cell_feat.append(np.abs(np.min(x[::freq])))
        cell_feat.append(np.abs(skew(x[::freq])))
        cell_feat.append(np.abs(kurtosis(x[::freq])))
        cell_feat.append(cell['summary']['QD'][1])
        cell_feat.append(cell['summary']['QD'][1:100].max() - cell['summary']['QD'][1])
        feats.append(np.log10(cell_feat))
    return np.array(feats)

def full_feature(batch, to_skip=None):
    # "Full" model: Delta Q(V) statistics, a linear fit to the capacity fade
    # curve, early-cycle capacity, charge time, temperature, and internal
    # resistance features.
    feats = []
    to_skip = to_skip or []
    if isinstance(to_skip, int):
        to_skip = [to_skip]
    for indx, cell in enumerate(batch):
        if indx in to_skip:
            continue
        cell_feat = []
        x = cell['cycles']['99']['Qdlin'] - cell['cycles']['9']['Qdlin']
        cell_feat.append(np.log10(np.var(x)))
        cell_feat.append(np.log10(np.abs(np.min(x))))
        # Linear fit to the capacity fade curve. Regressing on a column of
        # ones makes coef_ identically zero (scikit-learn centers the
        # regressors when fit_intercept=True), so the cycle index is used
        # as the regressor here.
        m = LinearRegression().fit(
            np.arange(2, 101).reshape(-1, 1),
            cell['summary']['QD'][1:100])
        cell_feat.append(np.abs(float(m.coef_[0])))
        cell_feat.append(np.abs(m.intercept_))
        cell_feat.append(np.log10(cell['summary']['QD'][1]))
        cell_feat.append(np.log10(cell['summary']['chargetime'][:5].mean()))
        cell_feat.append(np.log10(cell['summary']['Tavg'][1:100].sum()))
        cell_feat.append(np.log10(cell['summary']['IR'][1:100].min() + 1e-8))
        cell_feat.append(np.log10(abs(cell['summary']['IR'][100] - cell['summary']['IR'][1])))
        feats.append(cell_feat)

    return np.array(feats)
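
A side note on the capacity-fade fit above: with fit_intercept=True, scikit-learn centers the regressors, so regressing on a column of ones (an easy slip here) returns a coefficient of exactly zero and an intercept equal to the mean, which is why the cycle index is used as the regressor. A quick standalone check on synthetic numbers, nothing from the battery data:

import numpy as np
from sklearn.linear_model import LinearRegression

qd = np.linspace(1.08, 1.02, 99)            # synthetic, slowly fading capacity
ones = np.ones((99, 1))                     # constant regressor
cycles = np.arange(2, 101).reshape(-1, 1)   # cycle index 2..100

m_const = LinearRegression().fit(ones, qd)
m_cycle = LinearRegression().fit(cycles, qd)

print(m_const.coef_, m_const.intercept_)    # [0.] 1.05 -> slope feature carries no information
print(m_cycle.coef_, m_cycle.intercept_)    # small negative slope, sensible intercept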

I just fit a plain linear regression after standard scaling:

def train(x_train, x_test1, x_test2):
    scaler = preprocessing.StandardScaler().fit(x_train)
    x_train = scaler.transform(x_train)
    x_test1 = scaler.transform(x_test1)
    x_test2 = scaler.transform(x_test2)

    # Define and fit linear regression via enet
    # l1_ratios = [0.1, 0.5, 0.7, 0.9, 0.95, 0.99, 1]
    # enet = ElasticNetCV(l1_ratio=l1_ratios, cv=5, random_state=0)
    # Plain least-squares fit; y_train comes from the enclosing scope.
    enet = LinearRegression()
    enet.fit(x_train, y_train)

    # Predict on test sets
    y_train_pred = enet.predict(x_train)
    y_test1_pred = enet.predict(x_test1)
    y_test2_pred = enet.predict(x_test2)

    # Evaluate error
    return get_RMSE_for_all_datasets(y_train_pred, y_test1_pred, y_test2_pred)
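
For reference, here is the elastic-net variant corresponding to the commented-out lines, together with a stand-in for get_RMSE_for_all_datasets. This is a minimal sketch only: the real helper may differ, and y_train / y_test1 / y_test2 are assumed to be ground-truth cycle lives available at module level.

from sklearn.linear_model import ElasticNetCV

def train_enet(x_train, x_test1, x_test2, y_train):
    # Same preprocessing as train(), but with the cross-validated elastic net.
    scaler = preprocessing.StandardScaler().fit(x_train)
    x_train, x_test1, x_test2 = (scaler.transform(x) for x in (x_train, x_test1, x_test2))

    l1_ratios = [0.1, 0.5, 0.7, 0.9, 0.95, 0.99, 1]
    enet = ElasticNetCV(l1_ratio=l1_ratios, cv=5, random_state=0)
    enet.fit(x_train, y_train)
    return get_RMSE_for_all_datasets(enet.predict(x_train),
                                     enet.predict(x_test1),
                                     enet.predict(x_test2))

def get_RMSE_for_all_datasets(y_train_pred, y_test1_pred, y_test2_pred):
    # Hypothetical stand-in: RMSE in cycles against the global ground truths.
    rmse = lambda y, p: float(np.sqrt(np.mean((np.asarray(y) - np.asarray(p)) ** 2)))
    return (rmse(y_train, y_train_pred),
            rmse(y_test1, y_test1_pred),
            rmse(y_test2, y_test2_pred))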

fingertap avatar Dec 09 '22 08:12 fingertap

Hi, sorry for the delayed response. Just to confirm, you're saying your run of the notebook gave you different results than what is posted here?

petermattia avatar Jul 11 '23 00:07 petermattia

Actually, I can reproduce all of your results from the paper "Statistical learning for accurate and interpretable battery lifetime prediction", including the variance model. However, I cannot reproduce the discharge and full models from the Nature Energy paper, and I cannot find them in this repo either.

fingertap avatar Jul 11 '23 02:07 fingertap

Unfortunately, you need the license from Prof. Braatz for access to this code (see the readme). Doubly unfortunately, he rarely responds to requests for this license. Sorry about that.

petermattia avatar Jul 11 '23 03:07 petermattia