---
license: cc-by-nc-4.0
pipeline_tag: time-series-forecasting
tags:
  - time series
  - forecasting
  - pretrained models
  - foundation models
  - time series foundation models
  - time-series
---

# Moirai-2.0-R-Small

Moirai 2.0 is a decoder-only universal time series forecasting transformer model pre-trained on:
- A subset of the [GIFT-Eval Pretrain](https://huggingface.co/datasets/Salesforce/GiftEvalPretrain) and [Train](https://huggingface.co/datasets/Salesforce/GiftEval) datasets (non-leaking historical context).
- Mixup data generated from non-leaking subsets of the [Chronos dataset](https://arxiv.org/abs/2403.07815).
- Synthetic time series produced via KernelSynth, introduced in the [Chronos paper](https://arxiv.org/abs/2403.07815).
- Internal Salesforce operational data.

Moirai 2.0 makes significant improvements over the first version of Moirai (see the [paper](https://arxiv.org/abs/2402.02592) for the previous version):
- Switched from a distributional loss to a quantile loss formulation.
- Moved from single-token to multi-token prediction, improving efficiency and stability.
- Added a data filtering mechanism that removes non-forecastable, low-quality time series during pretraining.
- Added a new patch token embedding that encodes missing-value information.
- Added patch-level random masking to improve the model's robustness during inference.
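To illustrate the quantile loss formulation, here is a minimal sketch of the standard quantile (pinball) loss in plain Python. This is not the uni2ts implementation; the quantile grid below is purely illustrative:

```python
def pinball_loss(y_true: float, y_pred: float, q: float) -> float:
    """Quantile (pinball) loss for one prediction at quantile level q."""
    diff = y_true - y_pred
    # Under-prediction (diff > 0) is penalized with weight q,
    # over-prediction (diff < 0) with weight (1 - q).
    return max(q * diff, (q - 1) * diff)


def multi_quantile_loss(y_true: float, preds: dict) -> float:
    """Average pinball loss over a set of predicted quantiles."""
    return sum(pinball_loss(y_true, p, q) for q, p in preds.items()) / len(preds)


# Predicting the 0.1 / 0.5 / 0.9 quantiles for a true value of 10.0:
preds = {0.1: 7.0, 0.5: 9.5, 0.9: 12.0}
print(multi_quantile_loss(10.0, preds))  # ≈ 0.25
```

Minimizing this loss over a grid of quantile levels trains the model to output a full predictive distribution without committing to a parametric family, which is what the switch away from a distributional loss achieves.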

## Usage
To perform inference with Moirai 2.0, install the uni2ts library from our [GitHub repo](https://github.com/SalesforceAIResearch/uni2ts).

1. Clone repository:
```shell
git clone https://github.com/SalesforceAIResearch/uni2ts.git
cd uni2ts
```

2. Create a virtual environment:
```shell
virtualenv venv
. venv/bin/activate
```

3. Build from source:
```shell
pip install -e '.[notebook]'
```

4. Create a `.env` file:
```shell
touch .env
```

A simple notebook to get started: [moirai_forecast.ipynb](https://github.com/SalesforceAIResearch/uni2ts/blob/main/example/moirai_forecast.ipynb)
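The snippet below is a hedged sketch of the loading pattern used in the uni2ts examples. The class and argument names (`Moirai2Forecast`, `Moirai2Module`, the `Salesforce/moirai-2.0-R-small` checkpoint id) are assumptions based on the repository's example notebook; treat the notebook above as the authoritative reference:

```python
# Sketch only: class names and module paths are assumed from the uni2ts
# example notebook and may differ in your installed version.
import pandas as pd
from gluonts.dataset.pandas import PandasDataset

from uni2ts.model.moirai2 import Moirai2Forecast, Moirai2Module  # assumed path

# Toy univariate series indexed by timestamp
df = pd.DataFrame(
    {"target": [float(i) for i in range(100)]},
    index=pd.date_range("2024-01-01", periods=100, freq="h"),
)
ds = PandasDataset({"series": df})

model = Moirai2Forecast(
    module=Moirai2Module.from_pretrained("Salesforce/moirai-2.0-R-small"),
    prediction_length=20,
    context_length=80,
    target_dim=1,
    feat_dynamic_real_dim=0,
    past_feat_dynamic_real_dim=0,
)

predictor = model.create_predictor(batch_size=32)
forecasts = list(predictor.predict(ds))
print(forecasts[0].mean)  # point forecast for the next 20 steps
```

Because the model is trained with a quantile loss, the returned forecast objects also expose quantile levels, not just a point prediction.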

## Citation

If you use any Moirai model or Uni2TS in your research or applications, please cite it using the following BibTeX:

```bibtex
@article{woo2024unified,
  title={Unified Training of Universal Time Series Forecasting Transformers},
  author={Woo, Gerald and Liu, Chenghao and Kumar, Akshat and Xiong, Caiming and Savarese, Silvio and Sahoo, Doyen},
  journal={arXiv preprint arXiv:2402.02592},
  year={2024}
}
```

## Ethical Considerations

This release is for research purposes only in support of an academic paper. 
Our models, datasets, and code are not specifically designed or evaluated for all downstream purposes. 
We strongly recommend users evaluate and address potential concerns related to accuracy, safety, and fairness before deploying this model. 
We encourage users to consider the common limitations of AI, comply with applicable laws, 
and leverage best practices when selecting use cases, particularly for high-risk scenarios where errors or misuse could significantly 
impact people’s lives, rights, or safety. For further guidance on use cases, refer to our AUP and AI AUP.