x-transformers
Can the continuous transformer autoregressive wrapper help with pre-training on time-series data?
Your work is an incredible resource for transformer architectures. They are well established in the NLP domain, and I have noticed their increasing use with time-series. I have, however, also noticed a dearth of code and tools to pull from when applying them to time-series.
I would be interested to try all the interesting goodies available on my time-series datasets, in particular pre-training and subsequent fine-tuning. How would one best go about it, if it is at all possible with the continuous transformer?
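For concreteness, here is a minimal sketch of what I have in mind, assuming the `ContinuousTransformerWrapper` / `ContinuousAutoregressiveWrapper` API shown in the README; the dimensions, depth, and objective below are illustrative, not a recommended setup:

```python
import torch
from x_transformers import (
    ContinuousTransformerWrapper,
    ContinuousAutoregressiveWrapper,
    Decoder,
)

# each time step is a vector of real-valued features (illustrative size)
num_features = 16

model = ContinuousTransformerWrapper(
    dim_in = num_features,      # input feature dimension per time step
    dim_out = num_features,     # predict a vector of the same size
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8
    )
)

# wrap for autoregressive next-step prediction (the pre-training objective)
pretrainer = ContinuousAutoregressiveWrapper(model)

# mock batch: (batch, sequence length, features)
series = torch.randn(8, 1024, num_features)

loss = pretrainer(series)   # regression loss on predicting the next time step
loss.backward()

# for fine-tuning, one could presumably drop the wrapper and attach a
# task-specific head (forecasting, classification, ...) on top of the
# pre-trained `model` -- is that the intended pattern?
```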
@Espritdelescalier Hey! So the truth is, time series is one domain that transformers still struggle with (tabular data being the other).
You would be better off going with N-Beats if you are in industry and want something off the shelf. However, if you are a PhD student looking to see if attention can win in this arena, I would recommend starting with this baseline and then working your way up
@Espritdelescalier you should try https://github.com/lucidrains/Mega-pytorch for time series
We have had some success with transformers for time series: https://arxiv.org/abs/2208.14236
@Espritdelescalier https://arxiv.org/abs/2211.14730