Detailed Record



A Model-Free Approach for Load Frequency Control Using Deep Reinforcement Learning


Abstract Frequency stability is crucial for the proper operation of power grids. The changing landscape of power systems, driven by continuously increasing demand and the rapid decommissioning of conventional generation, has changed frequency dynamics significantly. Load frequency control (LFC) is a mechanism for regulating frequency and ensuring stability by balancing supply and load. Conventional control strategies, which require an accurate mathematical model of the system, can face difficulty in capturing the complexity of power system dynamics. Therefore, this paper proposes a model-free strategy utilizing deep reinforcement learning (DRL) to implement the LFC mechanism without the involvement of a central controller. The problem is formulated and solved with the deep deterministic policy gradient (DDPG) algorithm to find an optimal policy for regulating frequency at the primary and secondary control stages. The proposed framework enables the generating unit, acting as an agent, to identify the optimal actions for maintaining system stability by analyzing generation and load patterns. The DRL-based method uses a test signal to change the dynamics of the system, and the agent, modeled with a recurrent neural network (RNN) inside the DDPG framework, learns to balance generation against the load and to restore the deviated frequency. The performance of the implemented method is validated against a proportional-integral (PI) controller.
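
For intuition, the sketch below shows the kind of single-area LFC loop such a study typically operates on, with a proportional-integral (PI) secondary controller as the comparison baseline mentioned in the abstract. All model parameters, gains, and the step-load disturbance are generic textbook assumptions chosen for illustration only; they are not taken from the paper, and `pi_control` is a hypothetical helper marking where a trained DDPG agent with an RNN policy would instead map the observed frequency-deviation history to a reference-power adjustment.

```python
"""
Illustrative sketch only: a linearized single-area load frequency control (LFC)
loop with a PI secondary controller, the baseline the paper compares against.
Parameters (H, D, R, Tg, Tt, Kp, Ki, the 0.02 p.u. step load) are generic
assumed values, not taken from the paper.
"""

# Assumed per-unit system parameters (typical single-area test values).
H, D = 5.0, 1.0      # inertia constant (s), load damping coefficient (p.u.)
R = 0.05             # governor droop (p.u.)
Tg, Tt = 0.2, 0.5    # governor and turbine time constants (s)
B = D + 1.0 / R      # frequency bias factor used in the area control error
Kp, Ki = 0.5, 0.3    # assumed PI gains for the secondary loop

dt, t_end = 0.01, 30.0
steps = int(t_end / dt)

def pi_control(ace, ace_integral):
    """Secondary (PI) control: command a change in reference power from the ACE."""
    return -Kp * ace - Ki * ace_integral

# States are deviations from nominal: frequency, mechanical power, valve position.
d_f = d_pm = d_pv = 0.0
ace_integral = 0.0
d_pl = 0.02          # step load increase applied at t = 0 (assumed disturbance)

history = []
for k in range(steps):
    ace = B * d_f                              # area control error (single area)
    ace_integral += ace * dt
    d_pref = pi_control(ace, ace_integral)     # <- a DDPG policy would act here

    # Forward-Euler integration of the linearized governor-turbine-swing model.
    d_pv += dt * (d_pref - d_f / R - d_pv) / Tg
    d_pm += dt * (d_pv - d_pm) / Tt
    d_f  += dt * (d_pm - d_pl - D * d_f) / (2.0 * H)
    history.append(d_f)

print(f"peak frequency dip: {min(history):.4f} p.u., "
      f"final deviation: {history[-1]:.5f} p.u.")
```

In the model-free setting described in the abstract, the agent is not given this explicit governor-turbine-swing model; it learns the control mapping from observed generation and load patterns during training and is then evaluated against the PI baseline above.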
Authors Bimal Pandey (University of Wyoming), Nga Nguyen (University of Wyoming)
Pages 1 - 6
Publication Date 3/6/2025
Type article
Open Access Closed Access
DOI https://doi.org/10.1109/tpec63981.2025.10907186