Abstract
With dynamic renewable energy generation and power demand, microgrids (MGs) exchange energy with one another to reduce their dependence on power plants. In this article, we present a reinforcement learning (RL)-based MG energy trading scheme that chooses the energy trading policy according to the predicted future renewable energy generation, the estimated future power demand, and the current MG battery level. The scheme uses a deep RL-based energy trading algorithm to address the supply-demand mismatch problem in a smart grid with a large number of MGs, without relying on models of the other MGs' renewable energy generation and power demand. A performance bound on the MG utility and on the dependence on the power plant is provided. Simulation results for a smart grid with three MGs, using wind speed data from the Hong Kong Observatory and electricity prices from ISO New England, show that this scheme significantly reduces the average power plant schedule and thus increases the MG utility in comparison with a benchmark scheme.
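The abstract describes an RL agent that selects a trading action from the MG battery level and the predicted renewable surplus. The following is a minimal illustrative sketch of that idea using tabular Q-learning on a toy environment; the state and action definitions, reward shaping, and dynamics are assumptions for illustration, not the paper's actual (deep RL) formulation.

```python
import random

ACTIONS = ["buy", "idle", "sell"]  # trade energy with the grid / other MGs

class MicrogridEnv:
    """Toy microgrid: battery level in {0..4}, net renewable surplus in {-1, 0, +1}."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.battery = 2

    def reset(self):
        self.battery = 2
        return (self.battery, self.rng.choice([-1, 0, 1]))

    def step(self, state, action):
        battery, surplus = state
        reward = 0.0
        if action == "sell" and battery > 0:
            battery -= 1
            reward += 1.0          # revenue from selling stored energy
        elif action == "buy" and battery < 4:
            battery += 1
            reward -= 1.2          # cost of buying energy
        battery = max(0, min(4, battery + surplus))  # renewables charge the battery
        if battery == 0 and surplus < 0:
            reward -= 2.0          # penalty: must fall back on the power plant
        return (battery, self.rng.choice([-1, 0, 1])), reward

def train(episodes=200, steps=50, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy exploration policy."""
    env, rng, Q = MicrogridEnv(seed), random.Random(seed), {}
    for _ in range(episodes):
        state = env.reset()
        for _ in range(steps):
            if rng.random() < eps:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
            next_state, reward = env.step(state, action)
            best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
            Q[(state, action)] = Q.get((state, action), 0.0) + alpha * (
                reward + gamma * best_next - Q.get((state, action), 0.0))
            state = next_state
    return Q

Q = train()
# Greedy action for a full battery with a positive renewable surplus.
policy_action = max(ACTIONS, key=lambda a: Q.get(((4, 1), a), 0.0))
```

The paper's scheme replaces the Q-table with a deep network so it scales to the larger state space of predicted generation, estimated demand, and battery level across many MGs.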
| Original language | English (US) |
| --- | --- |
| Article number | 8839066 |
| Pages (from-to) | 10728-10737 |
| Number of pages | 10 |
| Journal | IEEE Internet of Things Journal |
| Volume | 6 |
| Issue number | 6 |
| DOIs | |
| State | Published - Dec 2019 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Information Systems
- Hardware and Architecture
- Computer Science Applications
- Computer Networks and Communications
Keywords
- Energy trading
- power plant schedule
- reinforcement learning (RL)
- smart grids