tsai
Use PyTorch model in plain PyTorch after training
Hi, I just came across your package and it looks amazing; the implementation of the TimeSeriesTransformer in particular is exactly what I was looking for. However, I have a question:
I would like to train the TimeSeriesTransformer as a classifier, but once the training is finished, I would also like to use it as the encoder of an autoencoder. In plain PyTorch this is rather simple. I was wondering if it's possible to train your implementation of the TimeSeriesTransformer within the tsai framework (as in the tutorial notebooks), but after training access the trained model (or its weights) in plain PyTorch, so that further experiments like the one I am suggesting can be performed?
Thank you already and best, Felix
Hi,
With learn.model you can access the PyTorch model (an instance of nn.Module) that was trained in the learning process.
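A minimal sketch of reusing the trained weights outside the framework: save the state_dict and load it into a freshly constructed module. Here a small nn.Sequential stands in for the trained tsai model (in practice you would use model = learn.model and rebuild the same architecture); the filename is illustrative.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained tsai model; in practice: model = learn.model
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))

# Save only the weights...
torch.save(model.state_dict(), "tst_classifier.pt")

# ...and later load them into a fresh instance of the same architecture,
# entirely outside the tsai/fastai framework.
model2 = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))
model2.load_state_dict(torch.load("tst_classifier.pt"))
model2.eval()  # disable dropout/batchnorm training behavior for inference
```

From there you can treat model2 (or a submodule of it) like any other nn.Module, e.g. as the encoder of an autoencoder.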
@vrodriguezf
but using learn.model.eval()(X) doesn't give the same results as learn.get_X_preds(X),
which might mean that we have two different weight sets, one for each function.
So for example:
preds = learn.model.eval()(X)
y_pred1 = preds.max(dim=-1).indices  # .max(dim=-1) returns (values, indices)
_, _, y_pred2 = learn.get_X_preds(X)
y_pred1 does not equal y_pred2
Hi @Rabea007, bear in mind that X may undergo some transforms before it is passed to the model. Those transforms are applied by the dataset (item transforms) or the dataloader (batch tfms). So in your example y_pred1 may not be correct. During inference, it's important to ensure the data is transformed in the same way as the validation set was during training. That is exactly what learn.get_X_preds does for you.
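A toy illustration of why calling the raw model on untransformed X diverges from the framework's predictions, even with identical weights. Here simple standardization stands in for tsai's batch transforms, and a plain nn.Linear stands in for the trained model; all names are illustrative.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(3, 2)  # stand-in for the trained classifier
model.eval()

X = torch.randn(5, 3) * 10 + 4        # raw, untransformed input
mean, std = X.mean(0), X.std(0)
X_norm = (X - mean) / std             # what the dataloader would actually feed the model

with torch.no_grad():
    raw_pred = model(X).argmax(dim=-1)       # like learn.model.eval()(X)
    norm_pred = model(X_norm).argmax(dim=-1) # like predictions via the dataloader

# Same weights, different inputs: the predicted classes generally differ.
```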
I'll close this issue due to the lack of response.