Exploring Gegenbauer Autoregressive Moving Average (GARMA) Models in Time Series Analysis: A Tool for Long Memory Data
Introduction
With the vast amount of time series data being generated across the globe, from financial markets and cryptocurrencies to climate and environmental sciences, effective modeling has become essential. Time series models play a crucial role in identifying future patterns, forecasting values, and uncovering the factors that influence these dynamic processes.
While modern machine learning and deep learning models are increasingly applied in time series analysis, their lack of interpretability often limits their practical use. As a result, classical time series models such as AR, MA, ARIMA, and SARIMA remain widely used thanks to their transparency and interpretability. However, these traditional models typically fail to capture long-term dependencies in the data, that is, relationships between observations that are far apart in time. This is where long-memory time series models become valuable. Among them, the Autoregressive Fractionally Integrated Moving Average (ARFIMA) and Gegenbauer Autoregressive Moving Average (GARMA) models are particularly effective at modeling such dependencies.
Why Do We Need GARMA Models?
From SARIMA to GARMA: The Evolution
The SARIMA(p, d, q)(P, D, Q, s) model, short for Seasonal Autoregressive Integrated Moving Average, has long been a cornerstone of time series analysis. It captures the dependence of current values on past observations (the AR part), the influence of past random shocks (the MA part), and seasonality. However, real-world data often exhibit long-memory behavior, in which correlations decay slowly over time rather than vanishing quickly as SARIMA models assume. To address this, researchers introduced fractional differencing, which lets the differencing parameter take fractional rather than integer values and so avoids over-differencing. The GARMA model builds on this idea: by incorporating Gegenbauer polynomials, it models long memory at seasonal and cyclical frequencies simultaneously, making it well suited to series that show both long memory and periodic behavior.
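To make fractional differencing concrete, here is a short illustrative sketch (the function name is my own, not from any package): the operator (1 - B)^d expands into an infinite series of weights π_j, and for fractional d these weights decay slowly instead of cutting off, which is precisely what lets the model carry long memory.

```python
import numpy as np

def frac_diff_weights(d: float, n: int) -> np.ndarray:
    """First n weights pi_j in the expansion (1 - B)^d = sum_j pi_j B^j.

    Uses the standard recursion pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.
    """
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

# For integer d = 1 this reduces to ordinary first differencing: [1, -1, 0, ...]
# For fractional d the weights decay hyperbolically (long memory):
w = frac_diff_weights(0.4, 6)
```

Applying these weights to a series (a truncated sum of lagged values) gives the fractionally differenced series; the slow decay of the π_j is the signature of long memory.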
The GARMA Model Specification
A k-factor GARMA model for a series X_t can be written as

φ(B) (1 - B)^(id) ∏_{i=1}^{k} (1 - 2u_iB + B²)^(d_i) (X_t - μ) = θ(B) ε_t

where:
- B : the lag operator (B X_t = X_{t-1})
- μ : mean of the time series (if the series is not mean-centered)
- φ(B) : the short-memory autoregressive (AR) component of order p (p can be identified from the PACF, as in ARIMA models)
- θ(B) : the short-memory moving-average (MA) component of order q (q can be identified from the ACF, as in ARIMA models)
- (1 - 2u_iB + B²)^(d_i) : the i-th Gegenbauer factor, with u_i = cos(λ_i), where the λ_i are the Gegenbauer frequencies (these can be estimated from the peaks of the spectral density, or periodogram, of the series)
- d_i : the degree of fractional differencing at frequency λ_i (this needs to be estimated by methods such as the Whittle likelihood approximation)
- id : the degree of integer differencing (usually the order of integration of the time series)
- ε_t : the white-noise error (residual) term
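The Gegenbauer factor can be made concrete with a small sketch (function names are my own, and this is an illustration under simplifying assumptions, not a full implementation): inverting (1 - 2uB + B²)^(-d) yields Gegenbauer polynomial coefficients C_j via a standard three-term recursion, which gives a truncated MA(∞) recipe for simulating a k = 1 GARMA(0, d, 0) process.

```python
import numpy as np

def gegenbauer_coeffs(d: float, u: float, n: int) -> np.ndarray:
    """Coefficients C_j of (1 - 2uB + B^2)^(-d) = sum_j C_j B^j.

    Three-term recursion for Gegenbauer polynomials evaluated at u:
    C_0 = 1, C_1 = 2du, C_j = (2u(j + d - 1) C_{j-1} - (j + 2d - 2) C_{j-2}) / j.
    """
    c = np.empty(n)
    c[0] = 1.0
    if n > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def simulate_garma(n: int, d: float, lam: float, trunc: int = 500, seed: int = 0):
    """Simulate a k=1 GARMA(0, d, 0) series via its truncated MA(inf) form."""
    rng = np.random.default_rng(seed)
    c = gegenbauer_coeffs(d, np.cos(lam), trunc)
    eps = rng.standard_normal(n + trunc)
    # X_t = sum_j C_j * eps_{t-j}, computed as a convolution
    return np.convolve(eps, c, mode="full")[trunc : trunc + n]
```

The C_j both decay slowly (long memory, for 0 < d < 0.5 and |u| < 1 the process is stationary) and oscillate at frequency λ, which is why a single Gegenbauer factor captures persistent cyclical behavior.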
Estimation and Implementation
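A practical first step in fitting a GARMA model is locating the Gegenbauer frequency λ from the periodogram, as noted above. Below is a minimal sketch of that step (the function name is my own, and the demo uses a synthetic noisy cycle with a known frequency rather than a fitted GARMA series); full estimation of d, e.g. by Whittle likelihood, is provided by the R package garma cited in the references.

```python
import numpy as np

def estimate_gegenbauer_freq(x) -> float:
    """Angular frequency in (0, pi] at which the raw periodogram peaks."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) ** 2 / n          # raw periodogram
    freqs = 2.0 * np.pi * np.arange(len(spec)) / n  # angular frequencies
    k = 1 + np.argmax(spec[1:])                     # skip the zero frequency
    return freqs[k]

# Demo on a noisy cycle with known angular frequency pi/3 (not a GARMA fit)
rng = np.random.default_rng(42)
t = np.arange(4096)
x = np.cos((np.pi / 3) * t) + rng.standard_normal(t.size)
lam_hat = estimate_gegenbauer_freq(x)
```

For a true long-memory series the periodogram has a pole-like peak at each λ_i, so smoothing the periodogram before taking the argmax is often a sensible refinement.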
References / Further Reading
- Hunt, M. R. (2025). Package 'garma'. https://doi.org/10.1007/s00362-022-01290-3
- Peiris, S., Allen, D. E., & Hunt, R. (2025). Optimal Time Series Forecasting Through the GARMA Model. 1–23.
- Ferrara, L., & Guégan, D. (2006). Fractional Seasonality: Models and Application to Economic Activity in the Euro Area. 1–24.