BNAS: Efficient Neural Architecture Search Using Broad Scalable Architecture
Ding ZX (丁子祥)1,2; Yaran Chen1,2; Nannan Li1,2; Dongbin Zhao1,2; Zhiquan Sun3; C. L. Philip Chen4,5
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Date: 2020-03
Issue: 0  Pages: 0
Abstract

Efficient neural architecture search (ENAS) achieves remarkable efficiency in learning high-performance architectures via parameter sharing and reinforcement learning (RL). In the architecture search phase, ENAS employs a deep scalable architecture as the search space, and training this architecture consumes most of the search cost; moreover, the training time grows with the depth of the deep scalable architecture. Through experiments with ENAS on CIFAR-10, we find that reducing the number of layers of the scalable architecture effectively accelerates the search process, but suffers a prohibitive performance drop in the architecture estimation phase. In this article, we propose broad neural architecture search (BNAS), in which we elaborately design a broad scalable architecture, dubbed the broad convolutional neural network (BCNN), to solve the above issue. On the one hand, the proposed broad scalable architecture trains quickly owing to its shallow topology, and we adopt the RL and parameter sharing used in ENAS as the optimization strategy of BNAS; hence, the proposed approach achieves higher search efficiency. On the other hand, the broad scalable architecture extracts multi-scale features and enhancement representations and feeds them into a global average pooling (GAP) layer to yield more reasonable and comprehensive representations, so the performance of the broad scalable architecture can be guaranteed. In particular, we also develop two variants of BNAS that modify the topology of BCNN. To verify the effectiveness of BNAS, we perform several experiments, whose results show that 1) BNAS completes the search in 0.19 days, 2.37× less expensive than ENAS, which ranks best among RL-based NAS approaches; 2) compared with other small-size (0.5 million parameters) and medium-size (1.1 million parameters) models, the architectures learned by BNAS obtain state-of-the-art performance (3.58% and 3.24% test error, respectively) on CIFAR-10; and 3) the learned architecture achieves 25.3% top-1 error on ImageNet using only 3.9 million parameters.
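The broad topology the abstract describes can be illustrated with a minimal plain-Python sketch. The block functions, channel counts, and scales below are hypothetical stand-ins, not the authors' BCNN implementation; the sketch only shows the structural idea: several shallow paths produce multi-scale and enhancement representations, which are concatenated and fed to a global average pooling (GAP) layer instead of stacking many cells in depth.

```python
# Hypothetical sketch of a broad topology: shallow parallel paths + GAP.
# Feature maps are modeled as lists of channels; each channel is a flat
# list of spatial values. None of this is the authors' actual BCNN code.

def conv_block(feature_map, scale):
    """Stand-in for a convolution cell: rescales values to mark its path."""
    return [[v * scale for v in channel] for channel in feature_map]

def global_average_pool(feature_map):
    """GAP: average each channel's spatial values into one scalar."""
    return [sum(channel) / len(channel) for channel in feature_map]

# Input: 3 channels, 4 spatial positions each (flattened for simplicity).
x = [[1.0, 2.0, 3.0, 4.0] for _ in range(3)]

# Broad topology: a few short paths instead of one deep stack of cells.
multi_scale = [conv_block(x, s) for s in (1.0, 0.5)]  # multi-scale features
enhancement = conv_block(x, 2.0)                      # enhancement block

# Concatenate every representation along the channel axis, then apply GAP
# so the classifier sees all scales and the enhancement path at once.
concat = [ch for fmap in multi_scale + [enhancement] for ch in fmap]
logits_input = global_average_pool(concat)

print(len(concat), logits_input[0])  # 9 channels; mean of [1,2,3,4] is 2.5
```

Because every path is shallow, each forward pass is cheap, which is the abstract's argument for why training the broad search space is faster than training a deep scalable architecture of comparable capacity.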

Keywords: Broad convolutional neural network (BCNN), image classification, neural architecture search (NAS), reinforcement learning (RL)
Indexed by: SCI
Language: English
WOS ID: WOS:000732345100001
Research direction: Reinforcement and evolutionary learning
Citation statistics: cited 32 times [WOS]
Document type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/46583
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems_Deep Reinforcement Learning
Corresponding author: Dongbin Zhao
Affiliations:
1. State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
3. School of Automation and Electrical Engineering, University of Science and Technology Beijing
4. School of Computer Science & Engineering, South China University of Technology
5. College of Navigation, Dalian Maritime University
First author's affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding author's affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation:
GB/T 7714: Ding ZX, Chen Y, Li N, et al. BNAS: Efficient Neural Architecture Search Using Broad Scalable Architecture[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020(0): 0.
APA: Ding, Z. X., Chen, Y., Li, N., Zhao, D., Sun, Z., & Chen, C. L. P. (2020). BNAS: Efficient neural architecture search using broad scalable architecture. IEEE Transactions on Neural Networks and Learning Systems, (0), 0.
MLA: Ding, Z. X., et al. "BNAS: Efficient Neural Architecture Search Using Broad Scalable Architecture." IEEE Transactions on Neural Networks and Learning Systems (0) (2020): 0.
Files in this item:
BNAS.pdf (2713 KB), journal article, author accepted manuscript, open access, license: CC BY-NC-SA
 
