Forecasting sequences with expert ensembles generally assumes stationary or near-stationary processes; however, in complex systems and many real-world applications, we are frequently confronted with non-stationarities, including both abrupt and gradual changes in distribution. We present an algorithm for forecasting non-stationary time series by combining the predictions of a growing ensemble of models, adding new models as new data become available. Our approach modifies the exponentially weighted average forecaster [7] and the fixed share forecaster [3] to allow the ensemble of models to grow, but retains the property of performing almost as well as an oracle that knows the optimal sequence of models to use. Additionally, we relate this approach to recent work in sequential anomaly detection using exponential-family models [9] and to the larger context of universal prediction.
Cosma Rohilla Shalizi, Abigail Z. Jacobs, Aaron Cl
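The abstract describes combining a growing set of forecasting models with exponential weighting and fixed-share updates. Below is a minimal sketch of that general idea, not the authors' implementation: the class name `GrowingEnsemble`, the expert interface `predict(history)`, the learning rate `eta`, the share rate `alpha`, and the choice to initialize a new expert at the mean of the current weights are all illustrative assumptions.

```python
import numpy as np

class GrowingEnsemble:
    """Sketch of an exponentially weighted average forecaster with a
    fixed-share step, where new experts can be added as data arrive."""

    def __init__(self, eta=0.5, alpha=0.01):
        self.eta = eta        # learning rate for the exponential weights
        self.alpha = alpha    # fixed-share mixing rate
        self.experts = []     # models, each exposing .predict(history)
        self.weights = np.array([])

    def add_expert(self, expert):
        # Give the new expert the mean of the current weights (one simple
        # choice among several), then renormalize.
        self.experts.append(expert)
        if self.weights.size == 0:
            self.weights = np.array([1.0])
        else:
            self.weights = np.append(self.weights, self.weights.mean())
            self.weights /= self.weights.sum()

    def forecast(self, history):
        # Weighted-average prediction of the current ensemble.
        preds = np.array([e.predict(history) for e in self.experts])
        return float(np.dot(self.weights, preds))

    def update(self, history, outcome, loss=lambda p, y: (p - y) ** 2):
        # Exponential-weights update on the observed losses...
        preds = np.array([e.predict(history) for e in self.experts])
        losses = np.array([loss(p, outcome) for p in preds])
        w = self.weights * np.exp(-self.eta * losses)
        w /= w.sum()
        # ...followed by a fixed-share step, which redistributes a fraction
        # alpha of the weight uniformly so the ensemble can track changes
        # in which expert currently forecasts best.
        self.weights = (1 - self.alpha) * w + self.alpha / len(w)
```

In this sketch, newly fitted models are added via `add_expert` as more data become available, and the fixed-share redistribution keeps previously poor or newly added experts recoverable after a change in the underlying distribution.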