Large-Time-Series-Model
Questions on Unified Time Series Dataset and Model Training Resources
Hello,
I'm truly impressed by your recent work on applying decoder-only Transformer architectures to time series tasks. It's great to see such an approach being explored.
I have a couple of questions I’m hoping you could help with:
- Dataset Availability: You've mentioned the "Unified Time Series Dataset" in your research. Are there plans to make this dataset publicly available? If so, how and when can it be accessed?
- Resource Utilization for Model Training: You've discussed models of different sizes, such as 3M, 29M, and 51M parameters. Could you share details about the computational resources required for training these models? Specifically:
  - What GPU resources were used?
  - How long did training take for each model size?
  (A rough sketch of how I read these parameter counts is included below for context.)

Thanks for sharing your findings, and I look forward to your response!
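For reference on the second question, here is a minimal back-of-the-envelope sketch of how parameter counts in the 3M to 51M range might arise for a decoder-only Transformer over patch tokens. The depth, width, and patch-length values are my own illustrative guesses, not configurations taken from your paper.

```python
# Rough parameter count for a decoder-only Transformer on patch-tokenized
# series. All configurations below are illustrative guesses, not the
# settings used in the paper.

def decoder_param_count(d_model: int, n_layers: int, patch_len: int = 96) -> int:
    """Approximate parameters: patch embedding + n_layers decoder blocks + head."""
    d_ff = 4 * d_model                          # common FFN expansion, assumed
    embed = patch_len * d_model + d_model       # linear patch embedding (+ bias)
    attn = 4 * (d_model * d_model + d_model)    # Q, K, V, output projections
    ffn = 2 * d_model * d_ff + d_ff + d_model   # two FFN linear layers (+ biases)
    norms = 2 * 2 * d_model                     # two LayerNorms (scale + shift)
    block = attn + ffn + norms
    head = d_model * patch_len + patch_len      # project back to a patch
    return embed + n_layers * block + head

# Guessed configurations that land roughly in the quoted size range.
for name, cfg in {"~3M": dict(d_model=256, n_layers=4),
                  "~29M": dict(d_model=512, n_layers=9),
                  "~51M": dict(d_model=640, n_layers=10)}.items():
    print(f"{name}: {decoder_param_count(**cfg) / 1e6:.1f}M parameters")
```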
Best,