
Electronic data

  • Main Manuscript2b_accepted

    Accepted author manuscript, 1.16 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Real-time Energy Management in Smart Homes through Deep Reinforcement Learning

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Real-time Energy Management in Smart Homes through Deep Reinforcement Learning. / Aldahmashi, Jamal; Ma, Xiandong.
In: IEEE Access, Vol. 12, 01.04.2024, p. 43155-43172.



Vancouver

Aldahmashi J, Ma X. Real-time Energy Management in Smart Homes through Deep Reinforcement Learning. IEEE Access. 2024 Apr 1;12:43155-43172. Epub 2024 Mar 13. doi: 10.1109/ACCESS.2024.3375771


BibTeX

@article{8e9a5d7363fb4ae18bf7c654207c8b58,
title = "Real-time Energy Management in Smart Homes through Deep Reinforcement Learning",
abstract = "In light of the growing prevalence of distributed energy resources, energy storage systems (ESs), and electric vehicles (EVs) at the residential scale, home energy management (HEM) systems have become instrumental in amplifying economic advantages for consumers. These systems traditionally prioritize curtailing active power consumption, often at an expense of overlooking reactive power. A significant imbalance between active and reactive power can detrimentally impact the power factor in the home-to-grid interface. This research presents an innovative strategy designed to optimize the performance of HEM systems, ensuring they not only meet financial and operational goals but also enhance the power factor. The approach involves the strategic operation of flexible loads, meticulous control of thermostatic load in line with user preferences, and precise determination of active and reactive power values for both ES and EV. This optimizes cost savings and augments the power factor. Recognizing the uncertainties in user behaviors, renewable energy generations, and external temperature fluctuations, our model employs a Markov decision process for depiction. Moreover, the research advances a model-free HEM system grounded in deep reinforcement learning, thereby offering a notable proficiency in handling the multifaceted nature of smart home settings and ensuring real-time optimal load scheduling. Comprehensive assessments using real-world datasets validate our approach. Notably, the proposed methodology can elevate the power factor from 0.44 to 0.9 and achieve a significant 31.5% reduction in electricity bills, while upholding consumer satisfaction.",
keywords = "Adaptation models, Deep reinforcement learning, Electricity, Energy consumption, Energy management, Home appliances, Optimization, Power factor correction, Reactive power, Real-time systems, Scheduling, Smart homes, Uncertainty, appliances scheduling, deep reinforcement learning, home energy management, reactive power compensation, smart homes",
author = "Jamal Aldahmashi and Xiandong Ma",
year = "2024",
month = apr,
day = "1",
doi = "10.1109/ACCESS.2024.3375771",
language = "English",
volume = "12",
pages = "43155--43172",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
}
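The abstract reports raising the home-to-grid power factor from 0.44 to 0.9. For context, power factor is the ratio of active power to apparent power, PF = P/√(P² + Q²). The sketch below illustrates the reported before/after figures using hypothetical active (P) and reactive (Q) power values chosen only to reproduce those ratios; the paper's actual load data is not reproduced here.

```python
import math

def power_factor(p_kw: float, q_kvar: float) -> float:
    """Power factor = active power / apparent power, where
    apparent power = sqrt(P^2 + Q^2)."""
    return p_kw / math.hypot(p_kw, q_kvar)

# Hypothetical operating points matching the abstract's figures:
# heavy uncompensated reactive demand vs. compensated operation.
print(round(power_factor(1.0, 2.04), 2))  # → 0.44
print(round(power_factor(1.0, 0.48), 2))  # → 0.9
```

Reducing reactive draw at roughly constant active power (e.g. by sourcing Q locally from the energy storage and EV inverters, as the abstract describes) is what moves the ratio toward unity.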

RIS

TY - JOUR

T1 - Real-time Energy Management in Smart Homes through Deep Reinforcement Learning

AU - Aldahmashi, Jamal

AU - Ma, Xiandong

PY - 2024/4/1

Y1 - 2024/4/1

N2 - In light of the growing prevalence of distributed energy resources, energy storage systems (ESs), and electric vehicles (EVs) at the residential scale, home energy management (HEM) systems have become instrumental in amplifying economic advantages for consumers. These systems traditionally prioritize curtailing active power consumption, often at an expense of overlooking reactive power. A significant imbalance between active and reactive power can detrimentally impact the power factor in the home-to-grid interface. This research presents an innovative strategy designed to optimize the performance of HEM systems, ensuring they not only meet financial and operational goals but also enhance the power factor. The approach involves the strategic operation of flexible loads, meticulous control of thermostatic load in line with user preferences, and precise determination of active and reactive power values for both ES and EV. This optimizes cost savings and augments the power factor. Recognizing the uncertainties in user behaviors, renewable energy generations, and external temperature fluctuations, our model employs a Markov decision process for depiction. Moreover, the research advances a model-free HEM system grounded in deep reinforcement learning, thereby offering a notable proficiency in handling the multifaceted nature of smart home settings and ensuring real-time optimal load scheduling. Comprehensive assessments using real-world datasets validate our approach. Notably, the proposed methodology can elevate the power factor from 0.44 to 0.9 and achieve a significant 31.5% reduction in electricity bills, while upholding consumer satisfaction.

AB - In light of the growing prevalence of distributed energy resources, energy storage systems (ESs), and electric vehicles (EVs) at the residential scale, home energy management (HEM) systems have become instrumental in amplifying economic advantages for consumers. These systems traditionally prioritize curtailing active power consumption, often at an expense of overlooking reactive power. A significant imbalance between active and reactive power can detrimentally impact the power factor in the home-to-grid interface. This research presents an innovative strategy designed to optimize the performance of HEM systems, ensuring they not only meet financial and operational goals but also enhance the power factor. The approach involves the strategic operation of flexible loads, meticulous control of thermostatic load in line with user preferences, and precise determination of active and reactive power values for both ES and EV. This optimizes cost savings and augments the power factor. Recognizing the uncertainties in user behaviors, renewable energy generations, and external temperature fluctuations, our model employs a Markov decision process for depiction. Moreover, the research advances a model-free HEM system grounded in deep reinforcement learning, thereby offering a notable proficiency in handling the multifaceted nature of smart home settings and ensuring real-time optimal load scheduling. Comprehensive assessments using real-world datasets validate our approach. Notably, the proposed methodology can elevate the power factor from 0.44 to 0.9 and achieve a significant 31.5% reduction in electricity bills, while upholding consumer satisfaction.

KW - Adaptation models

KW - Deep reinforcement learning

KW - Electricity

KW - Energy consumption

KW - Energy management

KW - Home appliances

KW - Optimization

KW - Power factor correction

KW - Reactive power

KW - Real-time systems

KW - Scheduling

KW - Smart homes

KW - Uncertainty

KW - appliances scheduling

KW - deep reinforcement learning

KW - home energy management

KW - reactive power compensation

KW - smart homes

U2 - 10.1109/ACCESS.2024.3375771

DO - 10.1109/ACCESS.2024.3375771

M3 - Journal article

VL - 12

SP - 43155

EP - 43172

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -