Description
This thesis introduces the smart grid concept and how it has improved electric grid operation over the past decade. Modern communication infrastructures have improved access to live operational data points, and these data sets provide the pool of historical data that is one of the core requirements for applying Artificial Intelligence (AI) to today's electric system. Different branches of AI have been used to improve the electric system by providing benefits such as predicting device malfunctions, weather conditions, system outages, and load demand. This thesis focuses on using reinforcement learning to design an efficient energy management system for grid-tied energy storage systems (ESS). The second chapter introduces different reinforcement learning (RL) approaches and explores each approach's requirements, challenges, advantages, and disadvantages. The two main concepts studied in this research work are Q-learning and deep Q-learning (DQN). A DQN based on artificial neural networks (ANNs) is proposed for the design of a microgrid controller system. The proposed on-grid microgrid controller is designed to coordinate the main electric grid, aggregated loads, renewable generation, and advanced energy storage (AES). The third chapter dives into the technical requirements for designing the proposed controller. The environment, the DQN agent setup, and the reward formulation are explained in detail. A PSCAD dynamic power simulation is used to model the power system studied in this work. Furthermore, the moving-target issue inherent to DQN is discussed, and a solution is suggested to decouple the expected and target Q-values. The fourth chapter explains the case studies used to evaluate the model's performance. The three main objectives are voltage regulation, minimizing the cost of operation based on a market-price reward, and operating the AES to meet a demand-response objective based on daily net generation. In addition, the proposed agent is tested under extreme operating conditions. The last chapter summarizes the problem and the approaches taken to develop an energy management controller for grid-connected AES. Finally, next steps for further improving the designed system are discussed.
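The thesis's own controller code is not reproduced in this description. As a generic, hedged illustration of the target-network technique the abstract alludes to (decoupling the expected Q-values from the target Q-values), the sketch below shows a minimal DQN update in PyTorch. All names and sizes (state_dim, n_actions, network widths) are illustrative assumptions, not details taken from the thesis.

```python
# Minimal sketch of the standard target-network fix for the DQN "moving target"
# problem: targets are computed with a frozen copy of the online network, so the
# regression target does not shift with every gradient step.
# NOTE: all dimensions and names below are hypothetical, not from the thesis.
import copy
import torch
import torch.nn as nn

state_dim, n_actions, gamma = 8, 5, 0.99  # assumed state/action sizes and discount

# Online Q-network: maps a state vector to one Q-value per discrete control action.
q_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
# Target network: a periodically synchronized, gradient-free copy of the online net.
target_net = copy.deepcopy(q_net)
for p in target_net.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def dqn_update(states, actions, rewards, next_states, dones):
    """One temporal-difference update on a batch of transitions."""
    # Q(s, a) from the online network for the actions actually taken.
    q_sa = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    # Bootstrapped target uses the frozen target network, decoupling the
    # expected Q-values from the target Q-values during optimization.
    with torch.no_grad():
        max_next_q = target_net(next_states).max(dim=1).values
        target = rewards + gamma * (1.0 - dones) * max_next_q
    loss = nn.functional.mse_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def sync_target():
    """Copy online weights into the target network every fixed number of updates."""
    target_net.load_state_dict(q_net.state_dict())
```

In this pattern, the reward passed to dqn_update would encode the controller's objectives (for example, voltage regulation and market-price-based operating cost); the specific reward formulation used in the thesis is described in its third chapter and is not reproduced here.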