PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework
Hu, Yiming (1,2); Wang, Xingang (1); Gu, Qingyi (1)
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Publication Date: 2022-03-22
Pages: 14
Corresponding Author: Gu, Qingyi (qingyi.gu@ia.ac.cn)
Abstract: Neural architecture search (NAS) depends heavily on an efficient and accurate performance estimator. To speed up evaluation, recent advances such as differentiable architecture search (DARTS) and one-shot approaches train a weight-sharing super-network that reuses parameters among candidates, so that all child models can be evaluated efficiently instead of being trained from scratch. Although these methods significantly boost search efficiency, they inherently suffer from inaccurate and unstable performance estimation. To this end, we propose PWSNAS, a general and effective framework for powering weight-sharing NAS that shrinks the search space automatically, i.e., less important candidate operators are discarded. With this strategy, our approach progressively simplifies the original search space into a smaller, more promising one, making it easier for existing NAS methods to find superior architectures. In particular, we present two strategies to guide the shrinking process: detecting redundant operators with a new angle-based metric, and decreasing the degree of weight sharing in the super-network by increasing its parameters, which differentiates PWSNAS from existing shrinking methods. Comprehensive analysis experiments on NASBench-201 verify the superiority of the proposed metric over existing accuracy-based and magnitude-based metrics. PWSNAS can be easily applied to state-of-the-art NAS methods, e.g., single path one-shot neural architecture search (SPOS), FairNAS, ProxylessNAS, DARTS, and progressive DARTS (PDARTS). We evaluate PWSNAS and demonstrate consistent performance gains over baseline methods.
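The abstract only names the angle-based metric; in the related angle-based shrinking literature, a candidate's score is typically the angle between its weight vector at super-network initialization and after super-network training, with low-scoring operators discarded. The sketch below (a hypothetical helper angle_score in PyTorch, not the paper's exact formulation) illustrates that idea.

import torch

def angle_score(init_weights, trained_weights):
    """Hypothetical angle-based importance score for one candidate operator.

    Flattens the operator's weights at super-network initialization and
    after training into two vectors and returns the angle between them
    (in radians). Under the angle-based shrinking idea, operators whose
    weights barely rotate during training are treated as less important
    and become candidates for removal.
    """
    v0 = torch.cat([w.detach().flatten() for w in init_weights])
    v1 = torch.cat([w.detach().flatten() for w in trained_weights])
    # Cosine similarity with a small epsilon to avoid division by zero.
    cos = torch.dot(v0, v1) / (v0.norm() * v1.norm() + 1e-12)
    return torch.acos(cos.clamp(-1.0, 1.0)).item()

# Toy usage: score one "operator" whose weights moved during training.
if __name__ == "__main__":
    torch.manual_seed(0)
    init = [torch.randn(8, 8)]
    trained = [init[0] + 0.5 * torch.randn(8, 8)]
    print(f"angle score: {angle_score(init, trained):.4f}")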
Keywords: Computer architecture; Training; Optimization; Extraterrestrial measurements; Estimation; Computational modeling; Search problems; Metric; neural architecture search (NAS); search space shrinking; weight sharing
DOI: 10.1109/TNNLS.2022.3156373
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development Program of China [2018YFD0400902]; National Natural Science Foundation of China [61673376]; Scientific Instrument Developing Project of the Chinese Academy of Sciences [YJKYYQ20200045]
Funding Organization: National Key Research and Development Program of China; National Natural Science Foundation of China; Scientific Instrument Developing Project of the Chinese Academy of Sciences
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000773231900001
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Sub-direction Classification (of the seven major directions): Machine Learning
Citation Statistics: Cited 1 time (WOS)
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/48161
Collection: Precision Sensing and Control, CAS Engineering Laboratory for Industrial Vision Intelligent Equipment
Institute of Automation, Chinese Academy of Sciences
作者单位1.Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
2.Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100039, Peoples R China
第一作者单位中国科学院自动化研究所
通讯作者单位中国科学院自动化研究所
Recommended Citation:
GB/T 7714: Hu, Yiming, Wang, Xingang, Gu, Qingyi. PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 14.
APA: Hu, Yiming, Wang, Xingang, & Gu, Qingyi. (2022). PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 14.
MLA: Hu, Yiming, et al. "PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 14.
Files in This Item: No related files.