A Method for Modeling Time Delay-Related Measurement Errors, Applicable in Power and Energy Monitoring and in Fault Detection Algorithms for Energy Grids


Bibliographic Details
Main Authors: Marian Kampik, Łukasz Dróżdż, Jerzy Roj
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Energies
Subjects:
Online Access: https://www.mdpi.com/1996-1073/18/13/3524
Description
Summary: This paper proposes an error model for signal errors caused by delays in the measurement chain. The analysis focuses on the uncertainty budget of the input quantities of the digital signal-processing algorithm, examining how delays in the analog-to-digital processing stage affect the uncertainty of these quantities. A classification of delays is proposed, based on their properties and on the nature of the error signal they generate. In addition to the mathematical model and its verification by the Monte Carlo method, the paper discusses potential applications of the proposed analysis method to selected implementations of measurement chains. The method can be used, among other applications, for power and energy measurement, power grid fault detection, and short-circuit identification.
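The summary mentions Monte Carlo verification of a delay error model for power measurement. The sketch below illustrates the general idea only: a random channel delay is propagated through a digital active-power estimate to obtain an error distribution. All waveform parameters, the delay distribution, and the power-estimation formula are assumptions for this illustration, not values or methods taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed signal-chain parameters (illustrative, not from the paper)
F = 50.0                                      # grid frequency, Hz
FS = 10_000.0                                 # sampling rate, Hz
N = 2_000                                     # samples = 10 full periods
U = 230.0 * np.sqrt(2)                        # peak voltage, V
I = 5.0 * np.sqrt(2)                          # peak current, A
PHI = np.pi / 6                               # load phase angle, rad

t = np.arange(N) / FS
P_true = (U / np.sqrt(2)) * (I / np.sqrt(2)) * np.cos(PHI)

def measured_power(tau):
    """Active power estimated from samples when the current channel lags by tau."""
    u = U * np.sin(2 * np.pi * F * t)
    i = I * np.sin(2 * np.pi * F * (t - tau) - PHI)
    return np.mean(u * i)

# Monte Carlo: delay drawn uniformly in [0, 2 us] (assumed, e.g. ADC aperture delay)
taus = rng.uniform(0.0, 2e-6, size=10_000)
errors = np.array([measured_power(tau) - P_true for tau in taus])

print(f"true active power : {P_true:.3f} W")
print(f"mean delay error  : {errors.mean():.6f} W")
print(f"error std. dev.   : {errors.std(ddof=1):.6f} W")
```

The resulting mean and standard deviation of the error sample are the kind of quantities that would enter an uncertainty budget for the delay-affected input quantity; the paper's actual model and verification procedure are, of course, more detailed.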
ISSN:1996-1073