
Use PyTorch model in plain PyTorch after training

Open fewagner opened this issue 3 years ago • 2 comments

Hi, I just came across your package and it really looks amazing; the implementation of the TimeSeriesTransformer in particular is exactly what I was looking for. However, I have a question:

I would like to train the TimeSeriesTransformer as a classifier, but once the training is finished, I would also like to use it as the encoder of an autoencoder. In plain PyTorch this is rather simple. I was wondering if it's possible to train your implementation of the TimeSeriesTransformer within the tsai framework (as in the tutorial notebooks), but after training access the trained model (or its weights) in plain PyTorch, so that further experiments like the one I am suggesting can be performed?

Thank you already and best, Felix

fewagner avatar Jul 28 '22 07:07 fewagner

Hi,

With learn.model you access the PyTorch model (an instance of nn.Module) that was trained during the learning process.
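To make this concrete, here is a minimal sketch of the round trip. A tiny `torch.nn.Linear` plays the role of the trained `learn.model` (in practice you would instantiate the same TimeSeriesTransformer architecture before loading the weights); the filename is just an example:

```python
import torch

# Stand-in for learn.model, which in tsai is a plain torch.nn.Module.
model = torch.nn.Linear(4, 2)

# Save only the weights; this works for any nn.Module, including tsai models.
torch.save(model.state_dict(), "encoder_weights.pt")

# Later, in plain PyTorch: rebuild the same architecture and load the weights.
encoder = torch.nn.Linear(4, 2)
encoder.load_state_dict(torch.load("encoder_weights.pt"))
encoder.eval()

with torch.no_grad():
    out = encoder(torch.zeros(1, 4))  # shape: (1, 2)
```

Since `learn.model` is an ordinary `nn.Module`, you can also skip the save/load step and reuse the live object (or a sub-module of it) directly as the encoder of your autoencoder.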

vrodriguezf avatar Jul 29 '22 13:07 vrodriguezf

@vrodriguezf

but using learn.model.eval()(X) doesn't give the same results as learn.get_X_preds(X), which might mean that we have two different weight sets for the two calls. For example:

preds = learn.model.eval()(X)         # raw model outputs
y_pred1 = preds.argmax(dim=-1)        # predicted class per sample
_, _, y_pred2 = learn.get_X_preds(X)  # tsai's inference helper

y_pred1 does not equal y_pred2

Rabea007 avatar Aug 04 '22 06:08 Rabea007

Hi @Rabea007, Bear in mind that X may undergo some transforms before it is passed to the model. Those transforms are applied by the dataset (item tfms) or the dataloader (batch tfms). So in your example y_pred1 may not be correct. During inference it's important to ensure the data is transformed in the same way as the validation set during training. That is exactly what learn.get_X_preds does.
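To illustrate why the two calls can diverge, here is a minimal sketch in plain PyTorch. The per-feature standardization below is a hypothetical stand-in for whatever item/batch tfms the dataloaders actually apply; the point is only that skipping the transforms changes the model's inputs, and therefore its outputs:

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 2)  # stand-in for the trained model
model.eval()

X = torch.randn(8, 4) * 10 + 5       # raw, unscaled input
mean, std = X.mean(0), X.std(0)      # stats a Normalize-style tfm would use

with torch.no_grad():
    raw_out = model(X)               # skips the transforms entirely
    tfm_out = model((X - mean) / std)  # mimics the validation pipeline

# The untransformed and transformed inputs produce different outputs,
# so argmax over raw_out can disagree with the pipeline's predictions.
same = torch.allclose(raw_out, tfm_out)  # False
```

In tsai/fastai the recommended route is to let the library apply the pipeline for you, either via learn.get_X_preds or by building a test dataloader from the trained Learner's dls, rather than normalizing by hand as in this sketch.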

oguiza avatar Nov 21 '22 12:11 oguiza

I'll close this issue due to the lack of response.

oguiza avatar Dec 05 '22 10:12 oguiza