A common technique for applying linear prediction to nonstationary signals is time segmentation followed by local analysis. In [1], the temporal evolution of the linear prediction coefficients (LPCs) is modeled as a Fourier series. This permits the analysis and optimization of longer speech segments, i.e., a virtually global analysis. Any resulting non-minimum-phase prediction error polynomial is stabilized by all-pass filtering. We show that introducing this stabilizing filter does not degrade the overall predictor performance.
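To illustrate the stabilization step, the following is a minimal NumPy sketch, not the implementation from [1]: a non-minimum-phase prediction error polynomial is made minimum phase by reflecting zeros outside the unit circle to their conjugate-reciprocal positions, which is equivalent to cascading the polynomial with an all-pass filter and leaves the magnitude response unchanged up to a constant gain. The function name `stabilize_lpc` and the example coefficients are chosen here for illustration only.

```python
import numpy as np

def stabilize_lpc(a):
    """Make a prediction error polynomial A(z) = 1 + a1*z^-1 + ... + ap*z^-p
    minimum phase by conjugate-reciprocal reflection of its unstable zeros.

    This corresponds to cascading A(z) with an all-pass filter: the
    magnitude response is preserved up to a constant gain factor.
    """
    a = np.asarray(a, dtype=complex)
    roots = np.roots(a)                      # zeros of A(z) in the z-plane
    outside = np.abs(roots) > 1.0            # zeros outside the unit circle
    roots[outside] = 1.0 / np.conj(roots[outside])  # reflect them inside
    a_min = np.poly(roots)                   # rebuild the (monic) polynomial
    # discard residual imaginary parts caused by numerical round-off
    return np.real(a_min)

# Example (illustrative coefficients): A(z) = 1 - 2.5 z^-1 + 1.2 z^-2
# has zeros at roughly 1.85 (unstable) and 0.65.
a = np.array([1.0, -2.5, 1.2])
print(stabilize_lpc(a))
```

Since the reflection only rescales the prediction error magnitude spectrum by a constant, the shape of the residual spectrum, and hence the prediction gain, is unaffected, which is consistent with the claim that the stabilizing filter does not degrade predictor performance.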