Abstract
Battery storage systems (BSS) are critical for maintaining stability and flexibility in renewable energy microgrids. However, the intermittency of renewable generation and the variability of load profiles introduce challenges for optimal energy management. This study presents a deep reinforcement learning (DRL) approach to real-time battery scheduling in a grid-connected microgrid. The control problem is modeled as a Markov decision process (MDP), and a Deep Q-Network (DQN) agent is trained to output optimal charging and discharging actions based on system states. Results indicate that the proposed DQN-based EMS achieves superior performance in balancing economic cost, energy self-sufficiency, and battery health compared with a conventional rule-based strategy. This study contributes to advancing reinforcement learning-based approaches for real-time battery scheduling in microgrid energy systems.
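As an illustrative sketch of the kind of MDP the abstract describes (not the paper's actual formulation), the snippet below models a battery with a three-action space (discharge, idle, charge), a state of charge (SoC) transition with round-trip efficiency, and a negative-cost reward. All parameter values (capacity, power rating, efficiency, prices) are hypothetical placeholders.

```python
import numpy as np

class BatteryMDP:
    """Toy battery-scheduling MDP: state = (normalized SoC, net load, price);
    actions: 0 = discharge, 1 = idle, 2 = charge. Parameters are illustrative."""

    def __init__(self, capacity_kwh=10.0, power_kw=2.0, eff=0.95, dt=1.0):
        self.capacity = capacity_kwh   # usable energy capacity [kWh]
        self.power = power_kw          # max charge/discharge power [kW]
        self.eff = eff                 # one-way efficiency
        self.dt = dt                   # timestep length [h]
        self.soc = 0.5 * capacity_kwh  # start at 50% state of charge

    def step(self, action, net_load_kw, price):
        """Apply an action given the current net load [kW] and price [$/kWh].
        Returns (next_state, reward); reward is the negative import cost."""
        p = (action - 1) * self.power  # battery power: +charge, -discharge
        if p > 0:  # clip charging so SoC cannot exceed capacity
            p = min(p, (self.capacity - self.soc) / (self.eff * self.dt))
        else:      # clip discharging so SoC cannot go below zero
            p = max(p, -self.soc * self.eff / self.dt)
        # SoC dynamics: efficiency losses on both charge and discharge
        self.soc += p * self.eff * self.dt if p > 0 else p * self.dt / self.eff
        grid_kw = net_load_kw + p      # power imported from the grid
        reward = -price * max(grid_kw, 0.0) * self.dt  # pay only for imports
        state = np.array([self.soc / self.capacity, net_load_kw, price])
        return state, reward
```

A DQN agent would observe `state`, pick one of the three discrete actions via its Q-network (epsilon-greedily during training), and learn from the resulting `(state, action, reward, next_state)` transitions stored in a replay buffer.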