
Trend-Fluctuation Decomposition with Deep Residual Networks for System Forecasting

Abstract

This study proposes a deep residual network-based forecasting method that addresses the high-dimensional dynamics, complex dependencies, and multi-scale evolution characteristic of backend system time series, with the goal of improving prediction accuracy and stability under non-stationary workloads, resource fluctuations, and multi-tenant contention. The method introduces a trend-fluctuation decomposition mechanism that decouples long-term trends from short-term variations, improving the characterization of behavioral patterns across time scales. A feature fusion module jointly represents multi-source metrics and contextual information, increasing the model's adaptability to changes in system state. A multi-scale attention mechanism then aggregates features across temporal granularities and reweights them, strengthening the capture of key contextual dependencies. Structurally, the residual network provides a deep feature propagation pathway that stabilizes gradients and enhances the model's nonlinear representation capacity. From a learning perspective, sensitivity analyses on key factors, including pseudo-label ratio, anomaly contamination rate, and decomposition coefficient, verify the model's robustness under varying data quality and environmental conditions. Experimental results demonstrate that the proposed method outperforms mainstream time series forecasting models across multiple metrics, achieving low error and high robustness in scenarios with heterogeneous metrics, complex dependency structures, and highly dynamic environments. This provides a reliable technical foundation for tasks such as backend service optimization, resource scheduling, and anomaly detection.
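The abstract does not specify how the trend-fluctuation decomposition is computed. As one minimal sketch of the general idea, assuming a simple centered moving-average trend (the function name `trend_fluctuation_decompose` and the `window` parameter are illustrative, not from the paper), the series can be split so that the two components sum back to the original signal:

```python
import numpy as np

def trend_fluctuation_decompose(series: np.ndarray, window: int = 11):
    """Split a 1-D series into a smooth trend and a residual fluctuation.

    Illustrative assumption: the trend is a centered moving average
    (edges handled by edge-padding); the fluctuation is the remainder,
    so trend + fluctuation reconstructs the series exactly.
    """
    pad = window // 2
    padded = np.pad(series, pad, mode="edge")      # extend edges for the window
    kernel = np.ones(window) / window              # uniform averaging kernel
    trend = np.convolve(padded, kernel, mode="valid")
    trend = trend[: len(series)]                   # guard for even windows
    fluctuation = series - trend                   # short-term variation
    return trend, fluctuation

# Example: linear trend plus an oscillation standing in for workload noise
t = np.linspace(0, 4 * np.pi, 200)
series = 0.5 * t + np.sin(5 * t)
trend, fluct = trend_fluctuation_decompose(series, window=11)
```

In a setup like the one described, the two components would then be modeled separately (long-term trend vs. short-term variation) before fusion; this sketch only shows the additive split itself.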
