An Improved Minimax-Q Algorithm Based on Generalized Policy Iteration to Solve a Chaser-Invader Game
Liu MS(刘民颂)1,2; Zhu YH(朱圆恒)1,2; Zhao DB(赵冬斌)1,2
2020-07
Conference Name: International Joint Conference on Neural Networks
Conference Date: 2020-05
Conference Venue: Online
Abstract

In this paper, we use reinforcement learning and zero-sum game theory to solve a Chaser-Invader game, which is in fact a Markov game (MG). Unlike the single-agent Markov decision process (MDP), an MG models the interaction of multiple agents, extending game theory to the MDP setting. This paper proposes an improved algorithm based on the classical Minimax-Q algorithm. First, to address the limitation that Minimax-Q applies only to discrete and simple environments, we replace traditional Q-learning with a Deep Q-network. Second, we propose a generalized policy iteration scheme to solve the zero-sum game, in which the agent computes the Nash equilibrium action at each step by linear programming. Finally, comparative experiments show that the improved algorithm performs as well as Monte Carlo Tree Search in simple environments and better than Monte Carlo Tree Search in complex environments.
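The linear-programming step mentioned in the abstract is standard for zero-sum matrix games: given the Q-values for every joint action in a state, the maximizing agent's equilibrium mixed strategy is the one that maximizes its worst-case expected payoff over the opponent's actions. The sketch below (not the authors' code; a generic illustration using `scipy.optimize.linprog`, with `minimax_strategy` a hypothetical helper name) shows that computation:

```python
import numpy as np
from scipy.optimize import linprog

def minimax_strategy(Q):
    """Solve max_x min_j sum_i x_i * Q[i, j] for a zero-sum matrix game.

    Q[i, j] is the payoff to the row player (maximizer) when it plays
    action i and the opponent plays action j.  Returns (game value,
    equilibrium mixed strategy x over the row player's actions).
    """
    n_rows, n_cols = Q.shape
    # Decision variables: [x_0, ..., x_{n-1}, v]; maximize v => minimize -v.
    c = np.zeros(n_rows + 1)
    c[-1] = -1.0
    # For every opponent action j:  v - sum_i x_i * Q[i, j] <= 0
    A_ub = np.hstack([-Q.T, np.ones((n_cols, 1))])
    b_ub = np.zeros(n_cols)
    # Probabilities sum to one; the value variable v is unconstrained.
    A_eq = np.ones((1, n_rows + 1))
    A_eq[0, -1] = 0.0
    b_eq = np.array([1.0])
    bounds = [(0, None)] * n_rows + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[-1], res.x[:n_rows]

# Matching pennies: the equilibrium is the uniform mixed strategy, value 0.
value, policy = minimax_strategy(np.array([[1.0, -1.0], [-1.0, 1.0]]))
```

In a Minimax-Q agent, `Q` would be the slice of the learned action-value function for the current state, and the returned value would serve as the backup target in place of the single-agent max.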

Indexed By: EI
Sub-direction Classification: Intelligent Control
State Key Laboratory Planning Direction: Intelligent Computing and Learning
Associated Dataset to Be Deposited: (not specified)
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/58505
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Deep Reinforcement Learning
Corresponding Author: Zhao DB(赵冬斌)
Affiliations: 1. Institute of Automation, Chinese Academy of Sciences; 2. School of Artificial Intelligence, University of Chinese Academy of Sciences
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Liu MS, Zhu YH, Zhao DB. An Improved Minimax-Q Algorithm Based on Generalized Policy Iteration to Solve a Chaser-Invader Game[C]//International Joint Conference on Neural Networks, 2020.
Files in This Item:
An Improved Minimax- (727 KB) · Conference Paper · Open Access · License: CC BY-NC-SA
File Name: An Improved Minimax-Q Algorithm based on Generalized Policy Iteration to Solve a Chaser-Invader Game.pdf
Format: Adobe PDF

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.