tslearn
LearningShapelets: impossible to pickle the model
https://github.com/tslearn-team/tslearn/blob/42a56cce63d8263982d616fc2bef9009ccbedab4/tslearn/shapelets/shapelets.py#L212-L886
I'm having issues trying to save my fitted model, getting this error:
AttributeError: Can't pickle local object 'make_gradient_clipnorm_fn.<locals>.<lambda>'
How to handle it?
To reproduce:
from tensorflow.keras.optimizers import Adam
from tslearn.shapelets import LearningShapelets

clf = LearningShapelets(n_shapelets_per_size={20: 1, 30: 1},
                        max_iter=1000,
                        optimizer=Adam(learning_rate=0.1),
                        verbose=0)
clf.fit(X_train, y_train)
clf.to_pickle("path/to/file.pkl")
Versions: python 3.6, tslearn 0.5.2, tensorflow 2.6.2
Does the fix described in #387 resolve the issue?
@GillesVandewiele Nope, because that fix addresses the 'loading' side by pointing out that `from_` is a class method, but I don't see anything there about saving the model differently from what I did. :(
It could perhaps be related to an older version of tensorflow, or something related to the optimizer being used: https://github.com/NREL/phygnn/issues/16
One way to circumvent the issue is to store just the learned shapelets and load them again when doing inference; they contain all the valuable information of a shapelet model.
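A minimal sketch of that workaround, with hypothetical helper names (not part of tslearn's API). It assumes a fitted univariate model whose `shapelets_` attribute holds plain numpy arrays, which pickle without trouble; at inference time, the shapelet transform is recomputed manually:

```python
import pickle
import numpy as np

# Hypothetical helpers: persist only the learned shapelets (plain numpy
# arrays) instead of the whole estimator with its Keras optimizer.

def save_shapelets(shapelets, path):
    # e.g. save_shapelets(clf.shapelets_, "shapelets.pkl")
    with open(path, "wb") as f:
        pickle.dump(list(shapelets), f)

def load_shapelets(path):
    with open(path, "rb") as f:
        return pickle.load(f)

def shapelet_transform(X, shapelets):
    """Min squared distance of each shapelet to each univariate series in X."""
    out = np.empty((len(X), len(shapelets)))
    for i, ts in enumerate(X):
        for j, shp in enumerate(shapelets):
            # All sliding windows of the series with the shapelet's length
            wins = np.lib.stride_tricks.sliding_window_view(ts, len(shp))
            out[i, j] = np.min(np.mean((wins - shp) ** 2, axis=1))
    return out
```

The resulting distance matrix can feed any downstream classifier. Note that tslearn stores shapelets per size, and for multivariate series the arrays carry an extra dimension, so this sketch would need adapting for those cases.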
@GillesVandewiele Yes, that is definitely a way to work around this issue.
As for the tensorflow version, I'm using 2.6.2 (updated the original post).
@alevangel : could you post a code snippet that does that? This would be helpful:
- for other users to see how to handle such errors
- in the future for tslearn devs (maybe you?) to figure out how to implement a fix
I understood that the problem is the Adam optimizer, which fails to serialize. I'm trying to figure out how to fix it.
@rtavenar sure, I'll share it here as soon as I have it.
Just a quick fix: since the problem is the Adam optimizer, and after fitting I won't need it anymore, I just swapped it out.
dummy = LearningShapelets(*args)    # with the standard (picklable) optimizer
my_clf = LearningShapelets(*args)   # with the Adam optimizer
my_clf.fit(data)
my_clf.optimizer = dummy.optimizer  # replace the unpicklable Adam instance
my_clf.to_pickle(path)
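The mechanism behind both the error and the fix can be shown on a toy class (not tslearn itself): pickling fails as soon as any attribute is a local lambda, like `make_gradient_clipnorm_fn.<locals>.<lambda>` in the traceback, and succeeds once that attribute is replaced by something picklable:

```python
import pickle

class Model:
    """Toy stand-in for an estimator holding an unpicklable attribute."""
    def __init__(self):
        # A local lambda cannot be pickled by reference
        self.optimizer = lambda g: g
        self.weights = [1.0, 2.0]  # the state we actually want to keep

m = Model()
try:
    pickle.dumps(m)                    # fails on the lambda attribute
except Exception as e:
    print("pickling failed:", e)

m.optimizer = "sgd"                    # swap in any picklable placeholder
restored = pickle.loads(pickle.dumps(m))
print(restored.weights)                # fitted state survives the round trip
```

This is also why replacing `my_clf.optimizer` with a freshly constructed default works: only the fitted weights matter after training, and the optimizer is just the attribute blocking serialization.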