
Can the continuous transformer autoregressive wrapper help with pre-training on time-series data?

Open Espritdelescalier opened this issue 1 year ago • 4 comments

Your work is an incredible resource for transformer architectures. They are well established in the NLP domain, and I have noticed their increasing use with time series. I have, however, also noticed a dearth of code and tools to draw from when they are applied to time series.

I would be interested to try all the interesting goodies available here on my time-series datasets, in particular pre-training and subsequent fine-tuning. How would one best go about it, if it is at all possible with the continuous transformer?

Espritdelescalier avatar Aug 04 '22 13:08 Espritdelescalier

@Espritdelescalier Hey! So the truth is, time series is one domain that transformers still struggle with (tabular data being the other).

You would be better off going with N-BEATS if you are in industry and want something off the shelf. However, if you are a PhD student looking to see whether attention can win in this arena, I would recommend starting with this baseline and working your way up.

lucidrains avatar Aug 04 '22 15:08 lucidrains

@Espritdelescalier you should try https://github.com/lucidrains/Mega-pytorch for time series

lucidrains avatar Oct 20 '22 15:10 lucidrains

We have had some success with transformers for time series: https://arxiv.org/abs/2208.14236

Froskekongen avatar Oct 25 '22 19:10 Froskekongen

@Espritdelescalier https://arxiv.org/abs/2211.14730

lucidrains avatar Dec 01 '22 18:12 lucidrains