Knowledge Commons of Institute of Automation, CAS
Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey
Authors | Xiao Wang1,2, et al.
Journal | Machine Intelligence Research
ISSN | 2731-538X |
Publication Year | 2023
Volume | 20
Issue | 4
Pages | 447-482
Abstract | With the urgent demand for generalized deep models, many pre-trained big models have been proposed, such as bidirectional encoder representations from transformers (BERT), vision transformer (ViT), and generative pre-trained transformers (GPT). Inspired by the success of these models in single domains (such as computer vision and natural language processing), multi-modal pre-trained big models have also drawn increasing attention in recent years. In this work, we give a comprehensive survey of these models and hope this paper provides new insights and helps new researchers track the most cutting-edge works. Specifically, we first introduce the background of multi-modal pre-training by reviewing conventional deep learning and pre-training works in natural language processing, computer vision, and speech. Then, we introduce the task definition, key challenges, and advantages of multi-modal pre-trained models (MM-PTMs), and discuss MM-PTMs with a focus on data, objectives, network architectures, and knowledge-enhanced pre-training. After that, we introduce the downstream tasks used for the validation of large-scale MM-PTMs, including generative, classification, and regression tasks. We also give visualization and analysis of the model parameters and results on representative downstream tasks. Finally, we point out possible research directions for this topic that may benefit future works. In addition, we maintain a continuously updated paper list for large-scale pre-trained multi-modal big models: https://github.com/wangxiao5791509/MultiModal_BigModels_Survey.
Keywords | Multi-modal (MM), pre-trained model (PTM), information fusion, representation learning, deep learning
DOI | 10.1007/s11633-022-1410-8 |
Sub-direction Classification under the Seven Major Directions | Other
State Key Laboratory Planned Direction Classification | Other
Dataset Associated with the Paper Requiring Deposit | No
Chinese Introduction | https://mp.weixin.qq.com/s/yX1DdDCA-nMluzOB6Qz3sw
Video Explanation | https://www.bilibili.com/video/BV1AC4y127eY/
Document Type | Journal Article
Item Identifier | http://ir.ia.ac.cn/handle/173211/55990
Collection | Academic Journals_Machine Intelligence Research
Author Affiliations | 1. Peng Cheng Laboratory, Shenzhen 518055, China; 2. School of Computer Science and Technology, Anhui University, Hefei 230601, China; 3. School of Computer Science, Peking University, Beijing 100871, China; 4. College of Computer Science, Sichuan University, Chengdu 610065, China
Recommended Citation (GB/T 7714) | Xiao Wang, Guangyao Chen, Guangwu Qian, et al. Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey[J]. Machine Intelligence Research, 2023, 20(4): 447-482.
APA | Wang, X., Chen, G., Qian, G., Gao, P., Wei, X.-Y., ... & Gao, W. (2023). Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey. Machine Intelligence Research, 20(4), 447-482.
MLA | Wang, Xiao, et al. "Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey." Machine Intelligence Research 20.4 (2023): 447-482.
Files in This Item
File Name/Size | Document Type | Version Type | Access Type | License
MIR-2022-07-224.pdf (3540KB) | Journal Article | Published Version | Open Access | CC BY-NC-SA
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.