Online Reinforcement Learning Approach for Real-Time Energy Management of Grid-Connected Battery Storage Microgrid System
Conference paper


Xiangru Shi, Flavie Didier, Abderrezak Badji, Sheikh Izzal Azid, Maurizio Cirrincione and Salah Laghrouche
IECON 2025 – 51st Annual Conference of the IEEE Industrial Electronics Society, pp. 1–7
IEEE
51st Annual Conference of the IEEE Industrial Electronics Society (IECON 2025) (Madrid, Spain, 14/10/2025–17/10/2025)
14/10/2025

Abstract

Keywords: Batteries; Battery Storage System; Deep reinforcement learning; Energy Management System (EMS); Grid-connected Microgrid; Job shop scheduling; Load modeling; Markov decision processes; Medical services; Microgrids; Power system stability; Real-time systems; Stability analysis
Battery storage systems (BSS) are critical for maintaining stability and flexibility in renewable energy microgrids. However, the variability of renewable generation and load profiles introduces challenges for optimal energy management. This study presents a deep reinforcement learning (DRL) approach to real-time battery scheduling in a grid-connected microgrid. The control problem is modeled as a Markov decision process (MDP), and a Deep Q-Network (DQN) agent is trained to output optimal charging and discharging actions based on system states. Results indicate that the proposed DQN-based EMS achieves superior performance in balancing economic cost, energy self-sufficiency, and battery health compared to a conventional rule-based strategy. This study contributes to advancing reinforcement learning-based approaches for real-time battery scheduling in microgrid energy systems.
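To make the MDP formulation concrete, the sketch below illustrates the kind of state/action/reward structure described in the abstract. It is not the paper's implementation: the time-of-use prices, state discretization, and wear penalty are hypothetical, and a tabular Q-learning update is used as a stdlib-only stand-in for the neural Q-function of a DQN.

```python
import random

# Hypothetical discretized battery-scheduling MDP (illustrative values only):
# state = (hour, state-of-charge bin), actions = discharge / idle / charge.
ACTIONS = [-1, 0, 1]                        # -1: discharge, 0: idle, 1: charge
SOC_LEVELS = 5                              # state-of-charge bins
HOURS = 24
PRICE = [0.10] * 7 + [0.25] * 12 + [0.10] * 5   # assumed tariff ($/kWh)
STEP_KWH = 1.0                              # energy moved per action

def step(hour, soc, action):
    """Apply an action; reward = grid-cost saving minus a small wear penalty."""
    new_soc = min(max(soc + action, 0), SOC_LEVELS - 1)
    moved = abs(new_soc - soc) * STEP_KWH
    # Discharging during expensive hours earns money; charging costs money.
    reward = -action * (new_soc != soc) * PRICE[hour] * STEP_KWH - 0.01 * moved
    return (hour + 1) % HOURS, new_soc, reward

# Tabular Q-learning update (stand-in for the DQN's gradient step):
# Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
Q = {(h, s): [0.0, 0.0, 0.0] for h in range(HOURS) for s in range(SOC_LEVELS)}
alpha, gamma, eps = 0.1, 0.95, 0.1
random.seed(0)

hour, soc = 0, 2
for _ in range(50_000):
    # Epsilon-greedy action selection.
    if random.random() < eps:
        a_idx = random.randrange(3)
    else:
        a_idx = max(range(3), key=lambda i: Q[(hour, soc)][i])
    nh, ns, r = step(hour, soc, ACTIONS[a_idx])
    Q[(hour, soc)][a_idx] += alpha * (r + gamma * max(Q[(nh, ns)]) - Q[(hour, soc)][a_idx])
    hour, soc = nh, ns

# Greedy policy at mid state-of-charge: the agent should learn to charge in
# cheap hours and discharge in expensive ones.
policy = {h: ACTIONS[max(range(3), key=lambda i: Q[(h, 2)][i])] for h in range(HOURS)}
print(policy)
```

In the full method, the tabular Q-dictionary is replaced by a neural network so the agent can generalize over continuous states (e.g. load, PV output, price, state of charge) rather than discrete bins.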
