Sagar Mishra
@fkiraly When we're saving to memory, I don't think it's possible to serialize two objects into a single variable, so we'd have to modify the in-memory serialization as: ```python if...
I'm making an early commit to explain how `keras` serialization is supposed to work for the in-memory case. As explained in my comment above, we'd now return a tuple of size three,...
> A simple solution to keep interface homogeneity is to return a tuple of size two

Sure thing, I'll modify the code to do that.

> is there a way...
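For context, a minimal sketch of the tuple-of-two approach, assuming the fitted network lives in `self.model_` and that the keras model is written to a temporary file to obtain raw bytes (keras models are not directly picklable). The helper names `save_to_memory` / `load_from_memory` are illustrative only, not the actual sktime API.

```python
import pickle
from pathlib import Path
from tempfile import TemporaryDirectory

from tensorflow import keras


def save_to_memory(estimator):
    """Illustrative: serialize a fitted deep estimator to a tuple of size two."""
    with TemporaryDirectory() as tmp:
        model_path = Path(tmp) / "model.h5"
        estimator.model_.save(str(model_path))  # keras model -> temp file
        model_bytes = model_path.read_bytes()   # temp file -> raw bytes

    # pickle everything except the (non-picklable) keras model
    state = {k: v for k, v in estimator.__dict__.items() if k != "model_"}
    # outer tuple of size two: (class, everything needed to rebuild)
    return type(estimator), (model_bytes, pickle.dumps(state))


def load_from_memory(serial):
    """Illustrative: rebuild the estimator from the (class, state) pair."""
    cls, (model_bytes, state_bytes) = serial
    estimator = cls.__new__(cls)
    estimator.__dict__.update(pickle.loads(state_bytes))

    # keras cannot load a model from raw bytes, so round-trip via a temp file
    with TemporaryDirectory() as tmp:
        model_path = Path(tmp) / "model.h5"
        model_path.write_bytes(model_bytes)
        estimator.model_ = keras.models.load_model(str(model_path))
    return estimator
```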
Things added:
- in-memory serialization of deep estimators.
- test to save an estimator as a file.
- in-memory serialization of deep regression estimators.

Tests should still fail either because...
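A rough sketch of what the file round-trip test could look like, assuming `CNNClassifier` as the deep estimator under test and a `load` utility under `sktime.base`; the fit settings and data shapes here are placeholders.

```python
import numpy as np

from sktime.base import load
from sktime.classification.deep_learning.cnn import CNNClassifier


def test_save_estimator_to_file(tmp_path):
    """Round-trip a fitted deep classifier through save(path) / load(path)."""
    X = np.random.random((10, 1, 20))  # 10 series, 1 channel, length 20
    y = np.array([0, 1] * 5)

    clf = CNNClassifier(n_epochs=1)
    clf.fit(X, y)
    y_pred = clf.predict(X)

    save_path = tmp_path / "clf"
    clf.save(save_path)       # serialize estimator + keras model to disk
    loaded = load(save_path)  # restore an equivalent fitted estimator

    np.testing.assert_array_equal(y_pred, loaded.predict(X))
```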
Apart from the ongoing discussion on saving/loading of DL models (issue #3022), the reason the current PR fails to even save via `self.model_.save()` is `keras.layers.Lambda()`, which is incompatible with...
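To illustrate the kind of failure involved (a generic reproduction, not the exact network in this PR): a `Lambda` layer wraps an arbitrary Python callable, which keras cannot serialize into the model config in a portable way, so `model.save()` or the subsequent `load_model()` can fail depending on the keras version and save format.

```python
from tensorflow import keras

# Functional model containing a Lambda layer around a plain Python lambda.
inputs = keras.Input(shape=(16,))
outputs = keras.layers.Lambda(lambda x: x * 2)(inputs)
model = keras.Model(inputs, outputs)

# Depending on the keras version and target format, either this save or
# the later load_model call fails: the Python callable inside Lambda has
# no portable serialized form.
model.save("lambda_model.h5")
reloaded = keras.models.load_model("lambda_model.h5")
```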
> Quick question: how common is it to install `keras-self-attention` if you already have `tensorflow` or `keras`? Should this be a core dl dependency or just a soft dependency?

I'm...
I see I was missing a warning-level soft dependency check for the network. I've added that; I think it should be good now. On a side note, [`MatrixProfileTransformer`](https://github.com/sktime/sktime/blob/main/sktime/transformations/series/matrix_profile.py) has no...
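For reference, the warning-level check can be expressed with sktime's soft dependency helper; the exact import path and `severity` argument below are my understanding of the current utility rather than a guaranteed interface.

```python
from sktime.utils.validation._dependencies import _check_soft_dependencies

# Warn (rather than raise) if the optional attention package is missing;
# severity="error" would raise an exception instead.
_check_soft_dependencies("keras-self-attention", severity="warning")
```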
Yes, we are both referring to the same dataset. I wanted to keep the size small as well, so I chose the COVID dataset.

> This PR only solves those...
Hello @fkiraly, I want to contribute to sktime, and since this was marked as a "good first issue" and I'm a newcomer, I made a few changes to the structure....
I've dropped in a pull request as you mentioned; thanks for the link. I think I've found a bug, but since I'm not sure, I'm writing it here as a...