The Role of Reinforcement Learning Control for Optimizing Building Energy Management Systems
NGUYEN, DUC HOANG (2024-12-09)
The permanent address of the publication is
https://urn.fi/URN:NBN:fi-fe20241209100416
Abstract
Modern energy management systems are increasingly challenged by the complexity, uncertainty, and high-dimensional data generated by advanced power systems. These challenges have driven a growing interest in integrating intelligent techniques such as machine learning (ML) and deep learning (DL) into energy management to improve system adaptability and efficiency. Among these approaches, reinforcement learning (RL) has emerged as a promising solution due to its ability to handle dynamic, sequential decision-making processes under uncertainty. RL has demonstrated its potential not only in energy management but also in related areas such as demand response, operational control, and renewable energy integration. This research delves into the development of RL-based frameworks for intelligent energy management in grid-interactive buildings, with a special focus on the integration of electric vehicles (EVs) as distributed energy resources (DERs). First, the thesis carries out a systematic review of the foundational principles of RL and its diverse applications within the domain of power systems.
Subsequently, a data-driven framework leveraging the Soft Actor-Critic RL algorithm is proposed to enable prosumers to reduce energy costs, enhance grid stability, improve renewable energy utilization, and maintain user comfort. Simulation results demonstrate the effectiveness of the proposed framework, showing significant performance gains over state-of-the-art control strategies in terms of cost efficiency, CO₂ emissions reduction, and grid resilience. Additionally, the study provides a critical evaluation of the practical challenges and opportunities of implementing RL-based systems in real-world scenarios. The insights gained highlight the transformative potential of RL in enabling adaptive and sustainable energy management practices. By addressing both the technical complexities and real-world applications, this research advances the understanding of intelligent energy systems and underscores the importance of RL in meeting the growing demands of modern energy infrastructures while promoting sustainability and economic viability.
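The abstract does not include the thesis's implementation, but the entropy-regularized ("soft") RL principle underlying Soft Actor-Critic can be sketched in its simplest tabular form. The toy example below is purely illustrative and is not the thesis's framework: it trains soft Q-learning on a hypothetical battery-arbitrage MDP, where an agent learns to charge when a made-up electricity price is low and discharge when it is high. All state definitions, prices, and hyperparameters are invented for this sketch.

```python
import math
import random

# Hypothetical toy MDP: a battery with levels 0..4; actions: 0=discharge, 1=idle, 2=charge.
# Reward = revenue from discharging at the current price minus the cost of charging.
LEVELS, ACTIONS, HORIZON = 5, 3, 8
PRICES = [1, 1, 3, 3, 1, 1, 3, 3]  # invented price signal: cheap hours, then expensive hours

def step(level, t, action):
    price = PRICES[t]
    if action == 2 and level < LEVELS - 1:   # charge: pay the current price
        return level + 1, -price
    if action == 0 and level > 0:            # discharge: earn the current price
        return level - 1, price
    return level, 0.0                        # idle, or an infeasible move

ALPHA, GAMMA, LR = 0.5, 0.95, 0.1            # entropy temperature, discount, step size
Q = [[[0.0] * ACTIONS for _ in range(LEVELS)] for _ in range(HORIZON + 1)]

def soft_value(t, level):
    # Entropy-regularized value: V(s) = alpha * log sum_a exp(Q(s,a)/alpha)
    return ALPHA * math.log(sum(math.exp(q / ALPHA) for q in Q[t][level]))

def sample_action(t, level):
    # Boltzmann policy: pi(a|s) proportional to exp(Q(s,a)/alpha)
    weights = [math.exp(q / ALPHA) for q in Q[t][level]]
    return random.choices(range(ACTIONS), weights=weights)[0]

random.seed(0)
for episode in range(3000):
    level = 2  # start each episode at mid battery level
    for t in range(HORIZON):
        a = sample_action(t, level)
        nxt, r = step(level, t, a)
        # Soft Bellman backup toward r + gamma * V(s')
        target = r + GAMMA * soft_value(t + 1, nxt)
        Q[t][level][a] += LR * (target - Q[t][level][a])
        level = nxt

# Greedy action at mid battery level for each hour of the horizon
policy = [max(range(ACTIONS), key=lambda a: Q[t][2][a]) for t in range(HORIZON)]
print(policy)
```

Soft Actor-Critic extends exactly this entropy-regularized objective to continuous state and action spaces by replacing the table with neural-network function approximators, which is what makes it suitable for the high-dimensional building-energy settings the abstract describes.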