EAT-NAS: elastic architecture transfer for accelerating large-scale neural architecture search
Fang, Jiemin (1,2); Chen, Yukang (4); Zhang, Xinbang (4); Zhang, Qian (3); Huang, Chang (3); Meng, Gaofeng (4); Liu, Wenyu (2); Wang, Xinggang (2)
Journal: SCIENCE CHINA-INFORMATION SCIENCES
ISSN: 1674-733X
Publication date: 2021-09-01
Volume: 64, Issue: 9, Pages: 13
Abstract

Neural architecture search (NAS) methods have been proposed to relieve human experts from tedious architecture engineering. However, most current methods are constrained to small-scale search owing to their huge computational resource consumption. Meanwhile, architectures searched on small datasets often carry no performance guarantee when applied directly to large datasets, due to the discrepancy between the datasets. This limitation impedes the wide use of NAS on large-scale tasks. To overcome this obstacle, we propose an elastic architecture transfer mechanism for accelerating large-scale NAS (EAT-NAS). In our implementation, architectures are first searched on a small dataset, e.g., CIFAR-10, and the best one is chosen as the basic architecture. The search process on a large dataset, e.g., ImageNet, is then initialized with the basic architecture as the seed, which accelerates the large-scale search. We propose not only a NAS method but also a mechanism for architecture-level transfer learning. In our experiments, we obtain two final models, EATNet-A and EATNet-B, which achieve competitive accuracies of 75.5% and 75.6%, respectively, on ImageNet. Both models also surpass models searched from scratch on ImageNet under the same settings. In terms of computational cost, EAT-NAS takes fewer than 5 days on 8 TITAN X GPUs, significantly less than the consumption of state-of-the-art large-scale NAS methods.
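The mechanism the abstract outlines (an evolutionary search on the large dataset warm-started by a seed architecture transferred from the small-dataset search) can be sketched compactly. The following is a minimal, hypothetical Python illustration, not the paper's implementation: the block encoding, the OPS/WIDTHS choices, the proxy_accuracy evaluator, and all hyperparameters are invented for the example, and the paper's elastic transfer involves more than simple population seeding.

```python
import copy
import random

# Hypothetical toy search space: an architecture is a list of block configs.
OPS = ["conv3x3", "conv5x5", "mbconv3", "mbconv6"]
WIDTHS = [16, 32, 64]

def random_architecture(num_blocks=5):
    """Sample an architecture from scratch."""
    return [{"op": random.choice(OPS), "width": random.choice(WIDTHS)}
            for _ in range(num_blocks)]

def mutate(arch):
    """Perturb one block: swap its operator or its width."""
    child = copy.deepcopy(arch)
    block = random.choice(child)
    if random.random() < 0.5:
        block["op"] = random.choice(OPS)
    else:
        block["width"] = random.choice(WIDTHS)
    return child

def proxy_accuracy(arch):
    """Stand-in evaluator; a real system would train the architecture on
    CIFAR-10 or ImageNet and return its validation accuracy."""
    return sum(b["width"] for b in arch) / (64.0 * len(arch)) \
        + 0.01 * random.random()

def evolve(evaluate, seed_arch=None, pop_size=8, generations=20):
    """Evolutionary search. If seed_arch is given, the initial population
    is produced by mutating the seed (the transfer step) instead of being
    sampled from scratch."""
    if seed_arch is None:
        population = [random_architecture() for _ in range(pop_size)]
    else:
        population = [mutate(seed_arch) for _ in range(pop_size)]
    scored = [(evaluate(a), a) for a in population]
    for _ in range(generations):
        # Tournament selection: mutate the better of two random parents.
        a, b = random.sample(scored, 2)
        parent = max(a, b, key=lambda s: s[0])[1]
        child = mutate(parent)
        scored.append((evaluate(child), child))
        scored.remove(min(scored, key=lambda s: s[0]))  # drop the worst
    return max(scored, key=lambda s: s[0])[1]

# Stage 1: search on the small dataset; keep the best ("basic") model.
basic_arch = evolve(proxy_accuracy)
# Stage 2: warm-start the large-scale search with the basic architecture.
final_arch = evolve(proxy_accuracy, seed_arch=basic_arch)
print(final_arch)
```

The point of the warm start is visible in evolve: with a seed, the initial population is drawn from the seed's neighborhood, so the expensive large-scale search begins near a known-good region of the search space rather than at random initial points.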

Keywords: architecture transfer; neural architecture search; evolutionary algorithm; large-scale dataset
DOI: 10.1007/s11432-020-3112-8
Indexed by: SCI
Language: English
Funding projects: National Natural Science Foundation of China (NSFC) [61876212]; National Natural Science Foundation of China (NSFC) [61976208]; National Natural Science Foundation of China (NSFC) [61733007]; Zhejiang Lab [2019NB0AB02]; HUST-Horizon Computer Vision Research Center
Funders: National Natural Science Foundation of China (NSFC); Zhejiang Lab; HUST-Horizon Computer Vision Research Center
WOS research areas: Computer Science; Engineering
WOS categories: Computer Science, Information Systems; Engineering, Electrical & Electronic
WOS accession number: WOS:000685212100001
Publisher: SCIENCE PRESS
Citation statistics: cited 6 times (WOS)
Document type: Journal article
Item identifier: http://ir.ia.ac.cn/handle/173211/45689
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Advanced Spatio-temporal Data Analysis and Learning
Corresponding author: Wang, Xinggang
Author affiliations:
1. Huazhong Univ Sci & Technol, Inst Artificial Intelligence, Wuhan 430074, Peoples R China
2. Huazhong Univ Sci & Technol, Sch Elect Informat & Commun, Wuhan 430074, Peoples R China
3. Horizon Robot, Beijing 100089, Peoples R China
4. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended citation formats:
GB/T 7714: Fang, Jiemin, Chen, Yukang, Zhang, Xinbang, et al. EAT-NAS: elastic architecture transfer for accelerating large-scale neural architecture search[J]. SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64(9): 13.
APA: Fang, Jiemin., Chen, Yukang., Zhang, Xinbang., Zhang, Qian., Huang, Chang., ... & Wang, Xinggang. (2021). EAT-NAS: elastic architecture transfer for accelerating large-scale neural architecture search. SCIENCE CHINA-INFORMATION SCIENCES, 64(9), 13.
MLA: Fang, Jiemin, et al. "EAT-NAS: elastic architecture transfer for accelerating large-scale neural architecture search". SCIENCE CHINA-INFORMATION SCIENCES 64.9 (2021): 13.
Files in this item:
File name: Fang2021_Article_EAT-NASElasticArchitectureTran.pdf (377 KB)
Format: Adobe PDF
Document type: Journal article
Version: Author's accepted manuscript
Access: Open access
License: CC BY-NC-SA