CASIA OpenIR

Browse/Search results: 6 records in total, showing records 1-6

PmcaNet: Pyramid multiscale channel attention network for electron microscopy image segmentation (Journal article)
Journal of Intelligent & Fuzzy Systems, 2024, Vol. 46, Issue 2, Pages 4895-4907
Authors: Gao, Kaihan; Ju, Yiwei; Li, Shuai; Yang, Xuebing; Zhang, Wensheng; Li, Guoqing
Keywords: Electron microscopy; Image segmentation; Convolutional neural network; Multiscale feature pyramid

Momentum Acceleration in the Individual Convergence of Nonsmooth Convex Optimization With Constraints (Journal article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, Vol. 33, Issue 3, Pages 1107-1118
Authors: Tao, Wei; Wu, Gao-Wei; Tao, Qing
Keywords: Heavy-ball (HB) methods; individual convergence; machine learning; momentum methods; nonsmooth optimization; sparsity

Classifying the tracing difficulty of 3D neuron image blocks based on deep learning (Journal article)
Brain Informatics, 2021, Vol. 8, Issue 1
Authors: Yang, Bin; Huang, Jiajin; Wu, Gaowei; Yang, Jian
Keywords: Deep learning; Tracing difficulty classification; Residual neural network; Fully connected neural network; Long short-term memory network

Exploring highly reliable substructures in auto-reconstructions of a neuron (Journal article)
Brain Informatics, 2021, Vol. 8, Issue 1
Authors: He, Yishan; Huang, Jiajin; Wu, Gaowei; Yang, Jian
Keywords: Neuronal morphology; Reconstruction; Local alignment; Motif

The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization (Journal article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Vol. 31, Issue 7, Pages 2557-2568
Authors: Tao, Wei; Pan, Zhisong; Wu, Gaowei; Tao, Qing
Keywords: Convergence; Extrapolation; Optimization; Acceleration; Machine learning; Task analysis; Machine learning algorithms; Individual convergence; Nesterov's extrapolation; nonsmooth optimization; sparsity

Primal Averaging: A New Gradient Evaluation Step to Attain the Optimal Individual Convergence (Journal article)
IEEE TRANSACTIONS ON CYBERNETICS, 2020, Vol. 50, Issue 2, Pages 835-845
Authors: Tao, Wei; Pan, Zhisong; Wu, Gaowei; Tao, Qing
Keywords: Convergence; Convex functions; Machine learning; Optimization methods; Linear programming; Cybernetics; Individual convergence; mirror descent (MD) methods; regularized learning problems; stochastic gradient descent (SGD); stochastic optimization