Google Research Releases TimesFM 2.5: A 200M Parameter Time Series Foundation Model for Forecasting
#Machine Learning


Startups Reporter

Google Research has unveiled TimesFM 2.5, a 200M-parameter, decoder-only foundation model for time series forecasting that supports context lengths of up to 16k points and continuous quantile predictions over horizons of up to 1,000 steps.

Google Research has released TimesFM 2.5, the latest iteration of its time series foundation model for forecasting applications. The model is a significant upgrade over its predecessor, TimesFM 2.0: the parameter count drops from 500M to 200M while the maximum context length grows from 2,048 to 16,000 time points.

TimesFM 2.5 introduces several key improvements to its forecasting capabilities. The model now supports continuous quantile forecasting up to a 1,000-step horizon through an optional 30M-parameter quantile head, allowing for more nuanced probabilistic predictions. In addition, the frequency indicator input has been removed, and several new forecasting flags give users finer control over model behavior.
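The release notes above do not spell out how the quantile head is trained, but quantile heads in forecasting models are conventionally fit with the quantile (pinball) loss, which penalizes under- and over-prediction asymmetrically. As an illustrative sketch (not taken from the TimesFM codebase), the loss for a target quantile q looks like this:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss: an asymmetric penalty whose minimizer
    is the q-th quantile of the target distribution."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# For a high quantile like q=0.9, predicting too low costs more than
# predicting too high, pushing the prediction toward the upper tail.
y_true = np.array([10.0, 12.0, 11.0])
under = pinball_loss(y_true, y_true - 1.0, q=0.9)  # predictions 1 too low
over = pinball_loss(y_true, y_true + 1.0, q=0.9)   # predictions 1 too high
```

With q = 0.9, each unit of under-prediction costs 0.9 while each unit of over-prediction costs only 0.1, which is why minimizing this loss across a grid of q values yields the kind of 10th-to-90th-percentile bands TimesFM 2.5 emits.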

The model is available through multiple channels, including the official TimesFM Hugging Face Collection and as an integrated feature in Google BigQuery. While this open-source version is not an officially supported Google product, the company has made it accessible for broader research and development purposes.

For developers looking to implement TimesFM 2.5, the installation process is straightforward. Users can clone the repository from GitHub and set up a virtual environment using uv, with support for both PyTorch and Flax backends. The package includes optional dependencies for XReg covariate support, making it flexible for various forecasting scenarios.
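The setup described above can be sketched as follows. The repository URL matches the project's GitHub organization, but the exact extras names (`torch`, `xreg`) are taken from the README's install pattern and may differ in your version, so treat them as assumptions:

```shell
# Illustrative setup; verify extras names against the TimesFM README.
git clone https://github.com/google-research/timesfm.git
cd timesfm
uv venv && source .venv/bin/activate
uv pip install -e ".[torch]"   # PyTorch backend; ".[xreg]" adds covariate support
```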

A practical code example demonstrates the model's usage with PyTorch. After setting high float32 matrix multiplication precision, users can load the pretrained model and configure it with specific forecasting parameters. The example shows how to generate both point forecasts and quantile forecasts across multiple input series, with the quantile forecast providing predictions from the 10th to 90th percentiles.
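The PyTorch example described above follows this shape. The class name, checkpoint identifier, and `ForecastConfig` fields below mirror the project README at release time; verify them against your installed version, as the loading API has changed between TimesFM releases:

```python
import numpy as np
import torch
import timesfm

# Higher matmul precision improves numerical stability on GPU.
torch.set_float32_matmul_precision("high")

# Checkpoint name as published in the TimesFM Hugging Face Collection.
model = timesfm.TimesFM_2p5_200M_torch.from_pretrained(
    "google/timesfm-2.5-200m-pytorch")

model.compile(
    timesfm.ForecastConfig(
        max_context=1024,
        max_horizon=256,
        normalize_inputs=True,
        use_continuous_quantile_head=True,  # the optional 30M quantile head
        fix_quantile_crossing=True,
    )
)

# Input series may have different lengths.
point_forecast, quantile_forecast = model.forecast(
    horizon=12,
    inputs=[
        np.linspace(0, 1, 100),
        np.sin(np.linspace(0, 20, 67)),
    ],
)
# point_forecast has one 12-step path per series; quantile_forecast adds
# a trailing axis covering the mean plus the 10th-90th percentile deciles.
```

Note that the quantile output makes the probabilistic spread explicit: downstream consumers can read off, say, the 90th-percentile path for conservative capacity planning rather than relying on the point forecast alone.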

The TimesFM 2.5 release represents Google Research's continued investment in foundation models for specialized domains. By focusing on time series forecasting, the team has created a model that addresses the unique challenges of temporal data prediction, including handling long sequences and providing probabilistic outputs that are crucial for decision-making in fields like finance, energy, and supply chain management.

Looking ahead, the TimesFM team has indicated that the repository will undergo further development to support an upcoming Flax version for faster inference, add back full covariate support, and expand documentation and examples. This ongoing development suggests that TimesFM will continue to evolve as a key tool in the time series forecasting ecosystem.

For those interested in exploring the technical details, the full paper "A decoder-only foundation model for time-series forecasting" was presented at ICML 2024, providing the theoretical foundation for the model's architecture and training approach.

Featured image: TimesFM 2.5 model architecture and capabilities
