Chronos (pretrained model)

Chronos is a framework for pretrained probabilistic time series models developed in 2024 by researchers at Amazon Web Services.[1] Chronos tokenizes time series values, through scaling and quantization, into a fixed vocabulary, which lets it reuse existing transformer-based language model architectures and train them with the standard cross-entropy loss. The models in the Chronos family are based on the T5 family, with sizes ranging from 20 million to 710 million parameters.[citation needed]
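
The scaling-and-quantization step can be illustrated with a short sketch. The function names, the bin count, and the clipping range below are illustrative assumptions, not the exact values or code of the Chronos implementation:

```python
import numpy as np

def tokenize(series, num_bins=4096, clip=15.0):
    """Sketch of Chronos-style tokenization: scale a series by its mean
    absolute value, then quantize the scaled values into a fixed
    vocabulary of bin IDs. (num_bins and clip are assumed values.)"""
    series = np.asarray(series, dtype=float)
    # Scaling: divide by the mean absolute value of the context window.
    scale = np.mean(np.abs(series)) or 1.0
    scaled = series / scale
    # Quantization: map the range [-clip, clip] onto num_bins token IDs.
    edges = np.linspace(-clip, clip, num_bins - 1)
    tokens = np.digitize(scaled, edges)
    return tokens, scale

def detokenize(tokens, scale, num_bins=4096, clip=15.0):
    """Map token IDs back to approximate real values via bin centers."""
    edges = np.linspace(-clip, clip, num_bins - 1)
    centers = np.concatenate(
        ([edges[0]], (edges[:-1] + edges[1:]) / 2, [edges[-1]])
    )
    return centers[tokens] * scale

tokens, scale = tokenize([10.0, 12.0, 9.0, 11.0])
approx = detokenize(tokens, scale)  # close to the original values
```

Because the vocabulary is a fixed set of integer IDs, the quantized series can be fed to an off-the-shelf language model exactly as word tokens would be, which is the property the Chronos framework exploits.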

Development

Chronos was pretrained on a broad array of publicly available time series datasets, supplemented by a synthetic dataset generated with Gaussian processes to improve its ability to generalize across tasks. This broad pretraining corpus prepared the model for a wide range of forecasting applications.

Performance

In an extensive benchmark covering 42 datasets, in which Chronos was compared against both traditional local models and modern deep learning approaches, the Chronos models produced two notable results:

Training Corpus Performance: On datasets included in its training corpus, Chronos models significantly outperformed competing methods.

Zero-Shot Performance: On datasets not seen during training, Chronos performed comparably to, and occasionally better than, models trained specifically on those datasets.

Impact

Results from these benchmarks highlight Chronos's ability to leverage time series data from diverse domains, markedly improving zero-shot accuracy on unseen forecasting tasks. The introduction and success of the Chronos models mark a significant step forward, suggesting that pretrained models can serve as effective, simplified components in forecasting pipelines across a range of fields.

References

  1. ^ Ganaie, Muhammad Athar (2024-03-15). "Amazon AI Researchers Introduce Chronos: A New Machine Learning Framework for Pretrained Probabilistic Time Series Models". MarkTechPost. Retrieved 2024-03-22.