
Add keras implementation of softdtw

Open · Ivorforce opened this issue 1 year ago · 1 comment

Let me know if this looks correct. I tested it, and for equal arrays it returns a negative value. I think that's because softmin can return values smaller than its inputs, which I guess makes sense if it has to be differentiable.
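For illustration (not part of the PR code), here is the soft-min over equal values, which already dips below the plain minimum:

import numpy as np

# Soft-min with smoothing gamma over n equal values v is
#   -gamma * log(n * exp(-v / gamma)) = v - gamma * log(n),
# which is strictly below v, so soft-DTW of identical series can come out negative.
gamma = 1.0
vals = np.zeros(3)  # three equal candidate costs
softmin = -gamma * np.log(np.sum(np.exp(-vals / gamma)))
print(softmin)  # -log(3) ≈ -1.0986, below min(vals) = 0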

Anyway, here's a minimal usage example:

import numpy as np
import keras
from keras import layers

# SoftDTWLoss is the loss class added in this PR.
model = keras.Sequential([
    layers.InputLayer(input_shape=(15, 1)),  # identity model: no trainable layers
])
model.compile(
    optimizer=keras.optimizers.Adam(0.001),
    loss=SoftDTWLoss()
)
history = model.fit(
    np.arange(15, dtype=float)[None, :, None],
    np.arange(15, dtype=float)[None, :, None],
    epochs=6,
)
print(model.predict(np.arange(15, dtype=float)[None, :, None]))

TODO

  • [ ] Docstrings updated
  • [ ] Code cleaned up

Ivorforce · Mar 07 '24 13:03

I added an even rougher version for the gradient. There's definitely some cleanup left to do, but at least it runs with the custom gradient function now.
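For anyone reading along, this is roughly the pattern involved, shown on a toy function rather than the actual soft-DTW code (assuming a TensorFlow backend; tf.custom_gradient lets the forward pass return the value together with a hand-written gradient function):

import tensorflow as tf

@tf.custom_gradient
def squared_with_manual_grad(x):
    # Forward pass: compute the value as usual.
    value = tf.square(x)

    def grad(upstream):
        # Backward pass: supply the analytic gradient by hand instead of
        # letting autodiff trace through the computation.
        return upstream * 2.0 * x

    return value, grad

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = squared_with_manual_grad(x)
print(tape.gradient(y, x))  # 6.0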

I am currently getting graph size issues when trying to use the loss with anything non-trivial, which I think is caused by the DTW being so recursion intensive. I may have to give up on using DTW for my own project unless I can work around that, but I'll leave the PR up either way for future reference, or for other people to fix and use in their projects.
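For future reference, here is an illustrative numpy sketch of the iterative dynamic program for soft-DTW (not the PR code); a graph-friendly version would presumably replace the Python recursion with a loop construct such as tf.while_loop, which is an assumption on my part:

import numpy as np

def soft_min(a, b, c, gamma):
    # Numerically stable soft-min: -gamma * logsumexp(-values / gamma).
    vals = np.array([a, b, c]) / -gamma
    m = vals.max()
    return -gamma * (m + np.log(np.exp(vals - m).sum()))

def soft_dtw(x, y, gamma=1.0):
    # Iterative (non-recursive) accumulation of the soft-DTW cost matrix.
    n, m = len(x), len(y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            R[i, j] = cost + soft_min(R[i - 1, j], R[i, j - 1], R[i - 1, j - 1], gamma)
    return R[n, m]

x = np.arange(15, dtype=float)
print(soft_dtw(x, x))  # negative for identical inputs, as noted above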

Ivorforce · Mar 07 '24 18:03