Committed by abdulfatir
Commit: c615a9c (1 parent: d968d90)

Update README

Files changed (1): README.md (+8, -4)
README.md CHANGED

@@ -12,6 +12,8 @@ tags:
 
 # Chronos-T5 (Tiny)
 
+🚀 **Update Nov 27, 2024**: We have released Chronos-Bolt⚡️ models that are more accurate (5% lower error), up to 250 times faster and 20 times more memory-efficient than the original Chronos models of the same size. Check out the new models [here](https://huggingface.co/amazon/chronos-bolt-tiny).
+
 Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
 
 For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
@@ -88,10 +90,12 @@ If you find Chronos models useful for your research, please consider citing the
 
 ```
 @article{ansari2024chronos,
-author = {Ansari, Abdul Fatir and Stella, Lorenzo and Turkmen, Caner and Zhang, Xiyuan and Mercado, Pedro and Shen, Huibin and Shchur, Oleksandr and Rangapuram, Syama Sundar and Pineda Arango, Sebastian and Kapoor, Shubham and Zschiegner, Jasper and Maddix, Danielle C. and Mahoney, Michael W. and Torkkola, Kari and Gordon Wilson, Andrew and Bohlke-Schneider, Michael and Wang, Yuyang},
-title = {Chronos: Learning the Language of Time Series},
-journal = {arXiv preprint arXiv:2403.07815},
-year = {2024}
+title={Chronos: Learning the Language of Time Series},
+author={Ansari, Abdul Fatir and Stella, Lorenzo and Turkmen, Caner and Zhang, Xiyuan and Mercado, Pedro and Shen, Huibin and Shchur, Oleksandr and Rangapuram, Syama Sundar and Pineda Arango, Sebastian and Kapoor, Shubham and Zschiegner, Jasper and Maddix, Danielle C. and Mahoney, Michael W. and Torkkola, Kari and Gordon Wilson, Andrew and Bohlke-Schneider, Michael and Wang, Yuyang},
+journal={Transactions on Machine Learning Research},
+issn={2835-8856},
+year={2024},
+url={https://openreview.net/forum?id=gerNCVqqtR}
 }
 ```
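To make the scaling-and-quantization step described in the README concrete, here is a minimal NumPy sketch of the idea. It is an illustration, not the exact Chronos implementation: the mean-scaling rule, the bin range, the bin count, and the `tokenize`/`detokenize` helper names are all assumptions made for this example.

```python
import numpy as np

# Illustrative bin setup; the vocabulary size and value range of the
# released Chronos models are configuration details that may differ.
N_BINS = 4096
BIN_CENTERS = np.linspace(-15.0, 15.0, N_BINS)

def tokenize(series: np.ndarray) -> tuple[np.ndarray, float]:
    """Scale a series by its mean absolute value, then map each scaled
    value to the nearest quantization bin (its token id)."""
    scale = float(np.mean(np.abs(series)))
    if scale == 0.0:
        scale = 1.0  # avoid division by zero for an all-zero context
    scaled = series / scale
    # Nearest-center quantization of each scaled value.
    tokens = np.abs(scaled[:, None] - BIN_CENTERS[None, :]).argmin(axis=1)
    return tokens, scale

def detokenize(tokens: np.ndarray, scale: float) -> np.ndarray:
    """Map token ids back to real values, up to quantization error."""
    return BIN_CENTERS[tokens] * scale

context = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0])
tokens, scale = tokenize(context)
recovered = detokenize(tokens, scale)  # ≈ context, up to bin resolution
```

A trained Chronos model then samples future token ids autoregressively given the tokenized context; detokenizing many sampled trajectories yields an empirical predictive distribution from which point forecasts (e.g., the median) and prediction intervals can be read off. In practice these steps are wrapped by the `chronos` package's `ChronosPipeline`, whose `predict` method returns such sample trajectories.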