Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey
Xiao Wang1,2; Guangyao Chen1,3; Guangwu Qian1; Pengcheng Gao1; Xiao-Yong Wei1,4; Yaowei Wang1; Yonghong Tian1,3; Wen Gao1,3
Journal: Machine Intelligence Research
ISSN: 2731-538X
Year: 2023
Volume: 20, Issue: 4, Pages: 447-482
Abstract

With the urgent demand for generalized deep models, many large pre-trained models have been proposed, such as bidirectional encoder representations from transformers (BERT), vision transformer (ViT), and generative pre-trained transformers (GPT). Inspired by the success of these models in single domains (such as computer vision and natural language processing), multi-modal pre-trained big models have drawn increasing attention in recent years. In this work, we give a comprehensive survey of these models and hope this paper provides new insights and helps fresh researchers track the most cutting-edge works. Specifically, we first introduce the background of multi-modal pre-training by reviewing conventional deep learning and pre-training works in natural language processing, computer vision, and speech. Then, we introduce the task definition, key challenges, and advantages of multi-modal pre-trained models (MM-PTMs), and discuss MM-PTMs with a focus on data, objectives, network architectures, and knowledge-enhanced pre-training. After that, we introduce the downstream tasks used for the validation of large-scale MM-PTMs, including generative, classification, and regression tasks. We also give a visualization and analysis of the model parameters and results on representative downstream tasks. Finally, we point out possible research directions for this topic that may benefit future works. In addition, we maintain a continuously updated paper list for large-scale pre-trained multi-modal big models: https://github.com/wangxiao5791509/MultiModal_BigModels_Survey.
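To make the subject concrete, the sketch below shows zero-shot image-text matching with CLIP, one representative multi-modal pre-trained model of the family this survey covers. It is an illustrative example only, not code from the paper: it assumes the Hugging Face transformers library and the publicly available openai/clip-vit-base-patch32 checkpoint.

```python
# Minimal sketch: zero-shot image-text matching with a multi-modal
# pre-trained model (CLIP). Assumes: pip install transformers torch pillow requests
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Any test image works; this COCO validation image is a common demo choice.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Encode candidate captions and the image into a shared embedding space.
texts = ["a photo of two cats", "a photo of a dog", "a photo of a car"]
inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# logits_per_image holds image-text similarity scores; softmax turns them
# into a distribution over the captions, i.e., zero-shot classification.
probs = outputs.logits_per_image.softmax(dim=1)
print({t: round(p.item(), 3) for t, p in zip(texts, probs[0])})
```

The image-text contrastive objective that makes this work is one instance of the pre-training objectives the survey categorizes, alongside the data, architecture, and knowledge-enhancement aspects discussed above.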

Keywords: Multi-modal (MM), pre-trained model (PTM), information fusion, representation learning, deep learning
DOI: 10.1007/s11633-022-1410-8
Sub-direction classification (seven major directions): Other
State Key Laboratory planning direction classification: Other
Paper-associated dataset requiring deposit:
Chinese-language overview: https://mp.weixin.qq.com/s/yX1DdDCA-nMluzOB6Qz3sw
Video walkthrough: https://www.bilibili.com/video/BV1AC4y127eY/
Citation statistics
WOS citations: 15
Document type: Journal article
Item identifier: http://ir.ia.ac.cn/handle/173211/55990
Collection: Academic Journals / Machine Intelligence Research
Author affiliations:
1. Peng Cheng Laboratory, Shenzhen 518055, China
2. School of Computer Science and Technology, Anhui University, Hefei 230601, China
3. School of Computer Science, Peking University, Beijing 100871, China
4. College of Computer Science, Sichuan University, Chengdu 610065, China
Recommended citation formats
GB/T 7714: Xiao Wang, Guangyao Chen, Guangwu Qian, et al. Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey[J]. Machine Intelligence Research, 2023, 20(4): 447-482.
APA: Xiao Wang, Guangyao Chen, Guangwu Qian, Pengcheng Gao, Xiao-Yong Wei, ... & Wen Gao. (2023). Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey. Machine Intelligence Research, 20(4), 447-482.
MLA: Xiao Wang, et al. "Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey". Machine Intelligence Research 20.4 (2023): 447-482.
Files in this item
File: MIR-2022-07-224.pdf (3540 KB) | Document type: journal article | Version: published | Access: open access | License: CC BY-NC-SA