Infinitely Many-Armed Bandits with Budget Constraints
Li HF(李海芳)1; Yingce Xia2; Haifang Li
2016-11-28
Conference: The 31st AAAI Conference on Artificial Intelligence
Conference dates: 2017/2/4–2017/2/9
Conference venue: San Francisco, California, USA
Abstract: We study the infinitely many-armed bandit problem with budget constraints, where the number of arms can be infinite and much larger than the number of possible experiments. The player aims to maximize his/her total expected reward under a budget constraint B on the cost of pulling arms. We introduce a weak stochastic assumption on the ratio of the expected reward to the expected cost of a newly pulled arm, which characterizes its probability of being a near-optimal arm. We propose an algorithm named RCB-I for this new problem, in which the player first randomly picks K arms, where K is sub-linear in B, and then runs the algorithm for the finite-arm setting on the selected arms. Theoretical analysis shows that this simple algorithm enjoys sub-linear regret in terms of the budget B. We also provide a lower bound for any algorithm under the Bernoulli setting. The regret bound of RCB-I matches the lower bound up to a logarithmic factor. We further extend this algorithm to the any-budget setting (i.e., where the budget is unknown in advance) and conduct the corresponding theoretical analysis.
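Based on the abstract's description, the two-phase idea can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the function names (`rcb_i_sketch`, `draw_arm`, `pull`) are hypothetical, and the finite-arm phase is approximated here by a generic UCB-style index on the empirical reward-to-cost ratio, which may differ from the exact finite-arm procedure RCB-I builds on.

```python
import math
import random

def rcb_i_sketch(draw_arm, pull, budget, k):
    """Two-phase sketch: sample k arms from the (possibly infinite) pool,
    then run a UCB-style budgeted bandit on that finite subset.

    draw_arm() -> a fresh arm drawn from the arm pool (hypothetical interface)
    pull(arm)  -> a (reward, cost) sample for the given arm
    """
    arms = [draw_arm() for _ in range(k)]       # phase 1: random subset
    reward_sum = [0.0] * k
    cost_sum = [0.0] * k
    pulls = [0] * k
    total_reward = 0.0
    t = 0
    while budget > 0:
        if t < k:
            i = t                               # pull each arm once first
        else:
            # phase 2: empirical reward/cost ratio plus exploration bonus
            i = max(range(k), key=lambda j:
                    reward_sum[j] / max(cost_sum[j], 1e-9)
                    + math.sqrt(2 * math.log(t + 1) / pulls[j]))
        r, c = pull(arms[i])
        if c > budget:                          # cannot afford this pull
            break
        budget -= c
        reward_sum[i] += r
        cost_sum[i] += c
        pulls[i] += 1
        total_reward += r
        t += 1
    return total_reward
```

For example, with Bernoulli rewards and unit costs, `rcb_i_sketch` spends at most `budget` pulls and tends to concentrate on the better arms in the sampled subset; the abstract's K would be chosen sub-linear in B rather than fixed as here.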
Keywords: Multi-armed Bandits; Budget Constraints; Infinitely Many Arms
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/15670
Collection: Achievements before 2009
Corresponding author: Haifang Li
Affiliations:
1. Institute of Automation, Chinese Academy of Sciences
2. University of Science and Technology of China
Recommended citation (GB/T 7714):
Li HF, Yingce Xia, Haifang Li. Infinitely Many-Armed Bandits with Budget Constraints[C], 2016.
File: aaai_17_full.pdf (307KB), Conference paper, Open access, License: CC BY-NC-SA

Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.