In the field of power systems, missing data can significantly impair operational efficiency and reliability. Recent studies have moved toward solutions that combine device integration with advanced deep learning techniques for data imputation. A notable experiment on a box-meter integrated metering device provides valuable insight into this issue: by simulating the device's data acquisition process, the researchers aimed to realistically reproduce how missing data arises while validating the effectiveness of various imputation methods.
The experiment addressed a core difficulty of the task: models must infer and recover latent temporal patterns from irregular, partially observed time series. In a detailed analysis, experimental results compared TimesNet with established baseline methods on power data imputation. A dataset of 321 user variables was analyzed, allowing the researchers to gauge imputation accuracy across different sample lengths and mask ratios. Specifically, the experiments used sample lengths of 96 and 128, with mask ratios of 12.5%, 25%, 37.5%, and 50%. This structured setup not only gives a thorough picture of the effectiveness of the different models but also illustrates the complexities inherent to the power data domain.
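The article does not spell out the masking procedure; a common choice, and the assumption behind the minimal sketch below, is to hide a uniformly random fraction of points in each window. The function name `mask_window` and the use of NumPy are illustrative, not the authors' code.

```python
import numpy as np

def mask_window(window, mask_ratio, seed=None):
    """Hide a uniformly random fraction of points in a (length, variables) window.

    Returns the masked window (hidden points set to NaN) and a Boolean mask
    that is True where values were removed.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(window.shape) < mask_ratio
    masked = window.copy().astype(float)
    masked[mask] = np.nan
    return masked, mask

# Example: one 96-step window over 321 user variables with 50% of points hidden.
window = np.random.rand(96, 321)
masked, mask = mask_window(window, mask_ratio=0.5, seed=0)
print(f"hidden fraction: {mask.mean():.3f}")
```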
A significant focus was placed on the 321st variable within the dataset, culminating in a confidence interval analysis of its imputation results. The confidence interval serves as a statistical check on the reliability of imputed values: with the imputed value at the center of a 95% interval, the true value is expected to fall inside the interval with 95% confidence, which is equivalent to the absolute error between imputed and actual values staying within half the interval width. This kind of rigorous scrutiny is critical for establishing trust in the imputation methods employed, particularly in high-stakes environments such as power systems.
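The article does not describe how the intervals were constructed. One plausible reading is a symmetric normal-approximation interval centered on each imputed value, with the spread estimated from residuals on held-out observed points; the sketch below follows that assumption, and `normal_ci` and `residual_std` are hypothetical names.

```python
import numpy as np

Z_95 = 1.96  # normal quantile for a two-sided 95% interval

def normal_ci(imputed, residual_std, z=Z_95):
    """Symmetric normal-approximation interval around each imputed value.

    With the imputed value at the center, the true value lying inside the
    interval is the same event as the absolute error being less than half
    the interval width (z * residual_std).
    """
    half_width = z * residual_std
    return imputed - half_width, imputed + half_width

# Example: residual spread estimated from errors on held-out observed points.
imputed = np.array([0.42, 0.55, 0.61])
low, high = normal_ci(imputed, residual_std=0.05)
print(np.column_stack([low, high]))
```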
Within this framework, TimesNet, iTransformer, and DLinear were selected for comparative imputation experiments. To assess their performance in high-missing-rate scenarios, the researchers used an imputation window of 96 steps and a mask rate of 50%. The imputation results were then visualized for intuitive comparison: black dashed lines represented the true data trajectories, blue dots marked the missing values, and red markers indicated the imputed results. This visual representation is not only informative but also crucial for understanding the nuances in performance across the different methodologies.
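A minimal matplotlib sketch of that plotting convention follows, using synthetic data; the arrays, the 50% mask, and the function `plot_imputation` are illustrative assumptions, not taken from the paper.

```python
import matplotlib.pyplot as plt
import numpy as np

def plot_imputation(time, truth, mask, imputed):
    """Plotting convention described above: black dashed line = true trajectory,
    blue dots = masked (missing) positions, red markers = imputed values there."""
    plt.figure(figsize=(10, 3))
    plt.plot(time, truth, "k--", label="true trajectory")
    plt.plot(time[mask], truth[mask], "bo", markersize=4, label="missing values")
    plt.plot(time[mask], imputed[mask], "rx", markersize=6, label="imputed results")
    plt.xlabel("time step")
    plt.ylabel("load")
    plt.legend()
    plt.tight_layout()
    plt.show()

# Example with synthetic data: a 96-step window with 50% of points masked.
t = np.arange(96)
y = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(96)
m = np.random.rand(96) < 0.5
plot_imputation(t, y, m, y + 0.05 * np.random.randn(96))
```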
The analysis revealed that DLinear was somewhat less effective on tasks with high missing-value ratios than the transformer-based iTransformer and the CNN-based TimesNet. The disparity can largely be attributed to the limitations of linear and MLP-style models, which focus on local features and struggle to capture the global temporal patterns essential for time series data, particularly under extended missingness. As a result, DLinear's performance suffers from inadequate modeling of the broader context from which the missing data stems.
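For context, DLinear decomposes the input with a moving average into trend and seasonal parts and maps each with a single linear layer along the time axis; since every output step is a fixed linear combination of input steps, there is little room for data-dependent, global context. The sketch below shows only that idea; the kernel size and layer sizes are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class DLinearSketch(nn.Module):
    """Moving-average decomposition into trend and seasonal parts, each mapped
    by a single linear layer along the time axis. Every output step is a fixed
    linear combination of input steps, limiting data-dependent, global context."""

    def __init__(self, seq_len, kernel=25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size=kernel, stride=1,
                                padding=kernel // 2, count_include_pad=False)
        self.linear_trend = nn.Linear(seq_len, seq_len)
        self.linear_season = nn.Linear(seq_len, seq_len)

    def forward(self, x):                  # x: (batch, seq_len, variables)
        x = x.permute(0, 2, 1)             # -> (batch, variables, seq_len)
        trend = self.avg(x)
        season = x - trend
        out = self.linear_trend(trend) + self.linear_season(season)
        return out.permute(0, 2, 1)        # -> (batch, seq_len, variables)

# Example: reconstruct a batch of 96-step windows over 321 variables
# (missing positions would first be filled with zeros or simple estimates).
model = DLinearSketch(seq_len=96)
print(model(torch.randn(8, 96, 321)).shape)   # torch.Size([8, 96, 321])
```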
In contrast, TimesNet demonstrated superior performance on the power imputation task, particularly under high missing rates. The model captures both local temporal features and multi-scale periodic variations through its convolutional layers, and its ability to extract diverse features at varying time scales proved instrumental in accurately imputing missing power data, contrasting sharply with DLinear's more limited modeling capacity. This handling of dynamic temporal features shows why TimesNet accounts for variations in time series data that simpler methods miss.
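The core of that mechanism is folding the 1D series into a 2D layout whose axes run within a period and across successive periods, so that an ordinary 2D convolution can mix both kinds of structure. The sketch below shows only the reshape idea under assumed sizes; TimesNet itself uses inception-style multi-kernel blocks and aggregates several candidate periods, and the period of 24 and the single Conv2d are illustrative.

```python
import torch
import torch.nn as nn

def fold_to_2d(x, period):
    """Reshape a (batch, length, channels) series into (batch, channels,
    period, length // period): one axis runs within a period, the other
    across successive periods."""
    b, length, c = x.shape
    n = length // period
    x = x[:, : n * period, :]                          # drop the ragged tail
    return x.reshape(b, n, period, c).permute(0, 3, 2, 1)

# A 2D convolution over this layout mixes information both within and
# across periods, capturing local shape and multi-period structure at once.
conv = nn.Conv2d(in_channels=321, out_channels=321, kernel_size=3, padding=1)
x = torch.randn(8, 96, 321)                            # batch of 96-step windows
features = conv(fold_to_2d(x, period=24))              # e.g. a daily period of 24 steps
print(features.shape)                                  # torch.Size([8, 321, 24, 4])
```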
Transformer-based architectures, especially iTransformer, emerged as the most potent tools within the current time series imputation landscape. Their strength lies in capturing global patterns and dependencies effectively while retaining the computational efficiency afforded by parallel processing. In the tests on the power dataset, TimesNet, iTransformer, and DLinear each exhibited distinct strengths and weaknesses, underscoring the need for approaches tailored to the characteristics of the data at hand.
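The inverted-token idea behind iTransformer is to embed each variable's entire window as one token and let self-attention relate variables to one another. The sketch below illustrates only that idea under assumed layer sizes; it is not the authors' architecture.

```python
import torch
import torch.nn as nn

class InvertedAttentionSketch(nn.Module):
    """iTransformer-style idea: embed each variable's whole window as one token,
    then let self-attention relate variables to one another, capturing global,
    series-wide dependencies with fully parallel computation."""

    def __init__(self, seq_len, d_model=64, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)      # one token per variable
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.project = nn.Linear(d_model, seq_len)

    def forward(self, x):                             # x: (batch, seq_len, variables)
        tokens = self.embed(x.transpose(1, 2))        # (batch, variables, d_model)
        mixed, _ = self.attn(tokens, tokens, tokens)  # attention across variables
        return self.project(mixed).transpose(1, 2)    # -> (batch, seq_len, variables)

# Example: a batch of 96-step windows over 321 variables.
model = InvertedAttentionSketch(seq_len=96)
print(model(torch.randn(8, 96, 321)).shape)           # torch.Size([8, 96, 321])
```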
A pivotal finding was that TimesNet delivered precise imputations, particularly in low-frequency segments, thanks to its frequency-domain analysis. Using the fast Fourier transform (FFT), TimesNet detects global periodic lengths with remarkable accuracy. By contrast, DLinear produced smooth imputation trajectories that showed systematic biases, especially in regions requiring more nuanced handling of local periodic behavior. The resulting performance underscores the varying strengths and limitations of the different approaches in accurately reconstructing time series data.
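A minimal sketch of FFT-based period detection in the TimesNet style: pick the strongest amplitudes of the spectrum and convert their frequency indices into period lengths. The function `dominant_periods` and the choice of the top two frequencies are illustrative assumptions.

```python
import torch

def dominant_periods(x, k=2):
    """Estimate the k most prominent period lengths of a (length, channels) series
    from the amplitude spectrum of its FFT, as TimesNet-style models do to decide
    how to fold the series into a 2D layout."""
    length = x.shape[0]
    amplitude = torch.fft.rfft(x, dim=0).abs().mean(dim=-1)  # average over channels
    amplitude[0] = 0                                          # ignore the DC component
    top_freqs = torch.topk(amplitude, k).indices
    return [length // int(f) for f in top_freqs]

# Example: a 96-step window with a clear 24-step (daily-like) cycle.
t = torch.arange(96, dtype=torch.float32)
x = torch.sin(2 * torch.pi * t / 24).unsqueeze(-1) + 0.05 * torch.randn(96, 1)
print(dominant_periods(x, k=2))   # the strongest period should be close to 24
```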
While DLinear struggled significantly in high missing rate scenarios, particularly with power data that exhibited complex periodicity, TimesNet demonstrated resilience by effectively capturing both local features and longer-term dependencies. The experiments illustrated that while transformer-based models like iTransformer show promise in managing sudden shifts in data, they occasionally display sensitivity to outliers that can distort imputed values beyond acceptable ranges. Such insights are pivotal for researchers and practitioners alike, who must remain vigilant in balancing accuracy with robustness in their models.
The overarching narrative emerging from this study highlights the effectiveness of integrating advanced metering devices with sophisticated imputation models to tackle the multi-temporal characteristics prevalent in power systems. TimesNet emerged as a robust tool capable of handling steady-state conditions adeptly, while iTransformer displayed proficiency in managing sudden dynamics such as equipment fluctuations. Both models demonstrate how contextual and environmental variations can be effectively addressed through the adoption of intelligent systems designed to impute missing data.
Notably, however, TimesNet is not without its challenges. The study revealed instances of lagging during high-frequency transient intervals, alongside minor phase drifts, indicating that the model would benefit from refined dynamic correction capabilities for various periodic components. Moreover, the complexities involved in decoupling transient high-frequency signals from steady-state conditions could further enhance imputation accuracy moving forward. Such findings present an opportunity for ongoing refinement in the methodologies employed within this domain, paving the way for more resilient and reliable approaches to power data imputation.
In conclusion, this rigorous exploration into the efficacy of different imputation methods within power systems underscores the pressing need to innovate in the face of ever-evolving data challenges. Whether through the lens of TimesNet’s convolutional advantages or the global capabilities presented by transformer architectures, the research signifies a crucial step forward in achieving greater data reliability in power systems, ultimately leading to improved operational outcomes in the real world.
Subject of Research: Power data imputation through device design and deep learning integration.
Article Title: Box-meter integrated solution for power data imputation through device design and deep learning integration.
Article References:
Gao, C., Lin, H. & Lin, Y. Box-meter integrated solution for power data imputation through device design and deep learning integration. Sci Rep 15, 36543 (2025). https://doi.org/10.1038/s41598-025-18439-3
DOI: 10.1038/s41598-025-18439-3
Keywords: power systems, data imputation, deep learning, TimesNet, iTransformer, DLinear, machine learning.