Realistic Peer-to-Peer Energy Trading Model for Microgrids Using Deep Reinforcement Learning

Chen, T. and Bu, S. (2019) Realistic Peer-to-Peer Energy Trading Model for Microgrids Using Deep Reinforcement Learning. In: 2019 IEEE PES Innovative Smart Grid Technologies Europe (ISGT-Europe), Bucharest, Romania, 29 Sep - 02 Oct 2019, ISBN 9781538682180 (doi: 10.1109/ISGTEurope.2019.8905731)

190489.pdf - Accepted Version



In this paper, we integrate deep reinforcement learning (DRL) with a realistic peer-to-peer (P2P) energy trading model to address the decision-making problem faced by microgrids (MGs) in the local energy market. First, an hour-ahead P2P energy trading model with a set of critical physical constraints is formulated. The decision-making process of energy trading is then modelled as a Markov decision process, and a DRL algorithm is used to find the optimal strategies for the MGs. Specifically, a modified deep Q-network (DQN) algorithm helps the MGs utilise their resources and devise better trading strategies. Finally, we perform simulations on several real-world electricity data sets. The DQN-based trading strategies improve the utilities of the MGs and, with a virtual penalty function, significantly reduce the scheduled power plant output. Moreover, the model can determine the best battery for the selected MG. The results show that this P2P energy trading model can be applied to real-world situations.
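The hour-ahead trading loop described above can be sketched as a toy Markov decision process trained with a DQN-style update. This is a minimal illustrative sketch only: the state (hour, battery level, net demand), the price curve, the reward with a "virtual penalty" term, and the use of a linear Q-function in place of the paper's deep network are all assumptions, not the authors' actual formulation.

```python
import random
from collections import deque

import numpy as np

# Hypothetical, simplified hour-ahead P2P trading MDP (not from the paper).
# State: (hour, battery state of charge, local net demand).
# Action: buy / hold / sell a fixed block of energy on the local market.

rng = np.random.default_rng(0)

N_ACTIONS = 3               # 0 = buy, 1 = hold, 2 = sell
STATE_DIM = 3
BATTERY_CAP = 10.0          # kWh, assumed
BLOCK = 2.0                 # kWh traded per decision, assumed

def price(hour):
    # Toy time-of-use P2P price curve (assumption).
    return 0.10 + 0.08 * np.sin(2 * np.pi * hour / 24)

def step(state, action):
    hour, soc, demand = state
    traded = {0: BLOCK, 1: 0.0, 2: -BLOCK}[action]   # + = energy bought
    soc = float(np.clip(soc + traded - demand, 0.0, BATTERY_CAP))
    # Utility: selling revenue minus buying cost, with a "virtual penalty"
    # (cf. the abstract) when the battery is fully drained.
    reward = -traded * price(hour) - (5.0 if soc == 0.0 else 0.0)
    next_state = ((hour + 1) % 24, soc, float(rng.uniform(0.0, 2.0)))
    return next_state, reward

# Linear Q-function as a stand-in for the paper's deep Q-network.
W = np.zeros((N_ACTIONS, STATE_DIM))

def q_values(state):
    x = np.array([state[0] / 24.0, state[1] / BATTERY_CAP, state[2] / 2.0])
    return W @ x, x

def train(episodes=200, alpha=0.05, gamma=0.95, eps=0.1):
    buffer = deque(maxlen=500)        # experience replay, as in standard DQN
    for _ in range(episodes):
        state = (0, BATTERY_CAP / 2, 1.0)
        for _ in range(24):           # one day = 24 hour-ahead decisions
            q, x = q_values(state)
            a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(np.argmax(q))
            nxt, r = step(state, a)
            buffer.append((x, a, r, nxt))
            # One TD update per sampled transition (minibatch replay).
            for bx, ba, br, bn in random.sample(list(buffer), min(8, len(buffer))):
                target = br + gamma * np.max(q_values(bn)[0])
                W[ba] += alpha * (target - W[ba] @ bx) * bx
            state = nxt
    return W

trained = train()
```

The epsilon-greedy exploration, replay buffer, and temporal-difference target mirror the generic DQN recipe; the paper's modified DQN and physical constraints are richer than this sketch.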

Item Type: Conference Proceedings
Glasgow Author(s) Enlighten ID: Chen, Tianyi and Bu, Dr Shengrong
Authors: Chen, T., and Bu, S.
College/School: College of Science and Engineering > School of Engineering > Systems Power and Energy
Copyright Holders: Copyright © 2019 IEEE
Publisher Policy: Reproduced in accordance with the copyright policy of the publisher
Related URLs:
