Awesome Deep Time-Series Representations
This repository is for readers interested in learning universal representations of time series with deep learning. If any papers are missing or you have other requests, please open an issue, create a pull request, or contact [email protected]. We will update this repository on a regular basis, following the top-tier conference publication cycles, to keep it up to date.
Next Batch: KDD 2024, ICDM 2024, CIKM 2024, NeurIPS 2024
Accompanying Paper: Universal Time-Series Representation Learning: A Survey
```bibtex
@article{trirat2024universal,
  title={Universal Time-Series Representation Learning: A Survey},
  author={Patara Trirat and Yooju Shin and Junhyeok Kang and Youngeun Nam and Jihye Na and Minyoung Bae and Joeun Kim and Byunghyun Kim and Jae-Gil Lee},
  journal={arXiv preprint arXiv:2401.03717},
  year={2024}
}
```
Proposed Taxonomy
Contents
Related Surveys (Latest Update: June 2024)
Time-Series Data Mining and Analysis
Representation Learning
Research Papers (Latest Update: ICML 2024)
Neural Architectural Approaches
Studies in this group focus on novel neural architecture designs, either combining basic building blocks or redesigning an architecture from scratch, to better capture temporal dependencies and inter-variable relationships in multivariate time series. Based on the degree of architectural adjustment, we further divide these studies into two categories: basic block combination and innovative redesign, as sketched below.
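To make the block-combination idea concrete, here is a minimal PyTorch sketch; all module names and hyperparameters are illustrative assumptions, not the design of any specific paper in this list. A 1D convolution captures local temporal patterns, and a Transformer encoder layer on top models long-range dependencies across time steps of a multivariate series.

```python
# Hypothetical "basic block combination" sketch: a convolutional block for
# local temporal patterns plus a self-attention block for long-range
# dependencies. Names and hyperparameters are illustrative only.
import torch
import torch.nn as nn

class ConvAttnEncoder(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Project the input variables into d_model channels per time step.
        self.conv = nn.Conv1d(n_vars, d_model, kernel_size=3, padding=1)
        self.attn = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars) -> representations: (batch, seq_len, d_model)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local patterns
        return self.attn(h)                               # long-range dependencies

# Example: a batch of 8 series, 100 time steps, 5 variables.
z = ConvAttnEncoder(n_vars=5)(torch.randn(8, 100, 5))
print(z.shape)  # torch.Size([8, 100, 64])
```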
Learning-Focused Approaches
Studies in this category focus on devising novel objective functions or pretext tasks for the representation learning process, i.e., model training. Depending on the use of labeled instances, the learning objectives can be categorized as supervised, unsupervised, or self-supervised. The difference between unsupervised and self-supervised learning lies in the presence of pseudo labels: unsupervised learning typically reconstructs the input, whereas self-supervised learning generates pseudo labels from the data itself and uses them as supervision signals.
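To illustrate this distinction, here is a hedged PyTorch sketch of the two objective types, assuming a generic encoder that maps a series of shape (batch, seq_len, n_vars) to a representation vector. The flip-detection pretext task and all names here are hypothetical examples, not a method from any listed paper.

```python
# Illustrative contrast between unsupervised (reconstruction) and
# self-supervised (pseudo-label) objectives. `enc`, `dec`, and `clf`
# are assumed, generic components.
import torch
import torch.nn.functional as F

def unsupervised_loss(enc, dec, x):
    # Unsupervised: no labels at all; the objective is to reconstruct
    # the input from its learned representation.
    return F.mse_loss(dec(enc(x)), x)

def self_supervised_loss(enc, clf, x):
    # Self-supervised: pseudo labels are generated from the data itself.
    # Here, half the batch is reversed in time, and a classifier learns
    # to detect the transformation (an example pretext task).
    flipped = torch.flip(x, dims=[1])
    inputs = torch.cat([x, flipped], dim=0)
    pseudo_labels = torch.cat(
        [torch.zeros(len(x), dtype=torch.long),
         torch.ones(len(x), dtype=torch.long)]
    )
    return F.cross_entropy(clf(enc(inputs)), pseudo_labels)

# Toy usage: a batch of 4 series, 50 time steps, 3 variables.
x = torch.randn(4, 50, 3)
enc = lambda s: s.mean(dim=1)                       # toy encoder: (B, T, V) -> (B, V)
dec = lambda z: z.unsqueeze(1).expand(-1, 50, -1)   # toy decoder: (B, V) -> (B, T, V)
clf = torch.nn.Linear(3, 2)                         # toy pretext-task classifier
print(unsupervised_loss(enc, dec, x), self_supervised_loss(enc, clf, x))
```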