TFNet: Multi-Semantic Feature Interaction for CTR Prediction
Shu Wu¹; Feng Yu¹; Xueli Yu¹; Qiang Liu¹; Liang Wang¹; Tieniu Tan¹; Jie Shao²; Fan Huang²
2020-07-25
Conference Name: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval
Conference Dates: 2020/07/25-30
Conference Venue: Virtual Event, China
Publisher: SIGIR ’20
Abstract

Click-through rate (CTR) prediction plays a central role in computational advertising and recommender systems. Several kinds of methods have been proposed in this field, such as Logistic Regression (LR), Factorization Machines (FM), and deep-learning-based methods like Wide&Deep, Neural Factorization Machines (NFM), and DeepFM. However, such approaches generally use the vector product of each pair of features, which ignores the different semantic spaces of the feature interactions. In this paper, we propose a novel Tensor-based Feature interaction Network (TFNet) model, which introduces an operating tensor to elaborate feature interactions via multi-slice matrices in multiple semantic spaces. Extensive offline and online experiments show that TFNet 1) outperforms competitive methods on the typical Criteo and Avazu datasets, and 2) achieves large improvements in revenue and click rate in online A/B tests in the largest Chinese App recommender system, Tencent MyApp.

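As a rough illustration of the idea described in the abstract, the sketch below computes pairwise feature interactions through a multi-slice operating tensor, where each slice corresponds to one semantic space. It is a minimal sketch assuming a bilinear form v_iᵀ T_s v_j per slice; all shapes, variable names, and the NumPy formulation are illustrative assumptions, not the authors' implementation. In a complete CTR model, such interaction vectors would typically feed into further layers that output a click probability.

```python
import numpy as np

# Minimal sketch (not the authors' code): F feature fields, d-dimensional
# embeddings, and an operating tensor with S semantic slices. All names and
# shapes are illustrative assumptions.
F, d, S = 4, 8, 3
rng = np.random.default_rng(0)

embeddings = rng.normal(size=(F, d))           # one embedding per feature field
operating_tensor = rng.normal(size=(S, d, d))  # S slice matrices, one per semantic space

# For every field pair (i, j), the interaction in slice s is assumed to be the
# bilinear form v_i^T T_s v_j, giving an S-dimensional vector per pair.
pair_interactions = []
for i in range(F):
    for j in range(i + 1, F):
        v_i, v_j = embeddings[i], embeddings[j]
        interaction = np.array([v_i @ operating_tensor[s] @ v_j for s in range(S)])
        pair_interactions.append(interaction)

# Stack into a (num_pairs, S) matrix that a downstream predictor could consume.
pair_interactions = np.stack(pair_interactions)
print(pair_interactions.shape)  # (6, 3) for F=4 fields and S=3 slices
```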
Sub-direction Classification (Seven Major Directions): Recommender Systems
State Key Laboratory Planned Research Direction: Intelligent Computing and Learning
Associated Dataset Requiring Deposit:
Document Type: Conference Paper
Item Identifier: http://ir.ia.ac.cn/handle/173211/57495
Collection: Pattern Recognition Laboratory
Author Affiliations: 1. Institute of Automation, Chinese Academy of Sciences
2. Tencent
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Shu Wu, Feng Yu, Xueli Yu, et al. TFNet: Multi-Semantic Feature Interaction for CTR Prediction[C]. SIGIR ’20, 2020.
Files in This Item:
File Name/Size: 3397271.3401304.pdf (1040 KB) | Document Type: Conference Paper | Open Access Type: Open Access | License: CC BY-NC-SA
 

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.