Neural Network-Aided NILM (NNAN) disaggregation: Revealing appliance consumption patterns with iterative subtraction

Non-Intrusive Load Monitoring (NILM) is a method to decompose overall electricity consumption into individual appliance-level data, utilizing the primary meter's readings without additional sensors on each device. This article introduces a novel approach, Neural Network-Aided NILM (NNAN), which reveals appliance consumption patterns through a sequential subtraction method. Our goal is to address the issue where high-power, frequently used appliances make it difficult for neural networks to accurately separate the usage of lower-power, less frequently used appliances. We mainly employ Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNN) with inception blocks as key components. Our proposed architecture is validated on three public datasets: AMPds2, ECO, and UK-DALE. The NNAN model showed promising results, achieving disaggregation accuracy improvements of up to 5.13% on AMPds2, 3.79% on ECO, and 9.55% on UK-DALE compared to the reference methods. Additionally, NNAN reduces model complexity, requiring up to 74% fewer parameters than traditional deep learning approaches, leading to improved computational efficiency. Finally, NNAN demonstrated a reduced correlation between appliance usage rates and disaggregation accuracies.

Bibliographic Details
Main Authors: Yacine Belguermi, Patrice Wira, Gilles Hermann
Author Affiliation: Institut de Recherche en Informatique, Mathématiques, Automatique et Signal, Université de Haute Alsace, Mulhouse, France
Format: Article
Language: English
Published: Elsevier, 2025-06-01
Series: Machine Learning with Applications, Vol. 20, Article 100667
ISSN: 2666-8270
DOI: 10.1016/j.mlwa.2025.100667
Subjects: NILM; Neural networks; Energy disaggregation; Inception CNN; LSTM
Online Access: http://www.sciencedirect.com/science/article/pii/S2666827025000507
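
The abstract above describes a sequential (iterative) subtraction pipeline: the most dominant appliances are estimated first, their estimated draw is subtracted from the mains signal, and the remaining, lower-power appliances are then estimated from the residual. The following minimal Python sketch illustrates only that control flow under stated assumptions; the `disaggregate_by_subtraction` helper and the threshold-based `predict_fn` placeholders are hypothetical stand-ins for the LSTM and inception-CNN estimators used in the article, not the authors' implementation.

```python
import numpy as np

def disaggregate_by_subtraction(mains, models):
    """Sketch of iterative-subtraction disaggregation (illustrative only).

    mains:  1-D array of aggregate power readings.
    models: list of (appliance_name, predict_fn) pairs, ordered from the most
            dominant appliance to the least; each predict_fn maps a residual
            signal to that appliance's estimated consumption (hypothetical API).
    """
    residual = mains.astype(float)
    estimates = {}
    for name, predict_fn in models:
        appliance_power = np.clip(predict_fn(residual), 0.0, None)
        estimates[name] = appliance_power
        # Remove this appliance's estimated draw before handling the next,
        # lower-power appliance.
        residual = np.clip(residual - appliance_power, 0.0, None)
    return estimates, residual

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(600)
    heater = 2000.0 * (np.sin(t / 50.0) > 0.5)   # dominant, high-power appliance
    fridge = 150.0 * ((t // 60) % 2)             # low-power appliance
    mains = heater + fridge + rng.normal(0, 5, t.size)

    # Placeholder "models": simple thresholds standing in for trained
    # per-appliance neural estimators.
    models = [
        ("heater", lambda x: np.where(x > 1000, 2000.0, 0.0)),
        ("fridge", lambda x: np.where(x > 75, 150.0, 0.0)),
    ]
    estimates, residual = disaggregate_by_subtraction(mains, models)
    print({k: round(float(v.sum()), 1) for k, v in estimates.items()})
```

Clipping the residual at zero after each subtraction is one simple way to keep estimation errors on the dominant appliances from propagating as negative power into the later, smaller appliances.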