
Research Article
Self-gated FM: Revisiting the Weight of Feature Interactions for CTR Prediction
@INPROCEEDINGS{10.1007/978-3-031-24383-7_23,
  author={Zhongxue Li and Hao Wu and Xin Wang and Yiji Zhao and Lei Zhang},
  title={Self-gated FM: Revisiting the Weight of Feature Interactions for CTR Prediction},
  proceedings={Collaborative Computing: Networking, Applications and Worksharing. 18th EAI International Conference, CollaborateCom 2022, Hangzhou, China, October 15-16, 2022, Proceedings, Part I},
  proceedings_a={COLLABORATECOM},
  year={2023},
  month={1},
  keywords={Factorization machines; Feature selection; Self-gating; Neural networks; Click-through rate prediction},
  doi={10.1007/978-3-031-24383-7_23}
}
- Zhongxue Li
- Hao Wu
- Xin Wang
- Yiji Zhao
- Lei Zhang
Year: 2023
Self-gated FM: Revisiting the Weight of Feature Interactions for CTR Prediction
COLLABORATECOM
Springer
DOI: 10.1007/978-3-031-24383-7_23
Abstract
With the successful application of factorization machine models to click-through rate prediction, the automatic selection of feature interactions has attracted extensive attention. The two most commonly used strategies, automatic construction of limited high-order cross-features and automatic learning of feature interaction weights, have made great progress. However, most studies still face the challenge of complex training and search processes. We therefore propose a self-gating mechanism for automatic feature selection in factorization machine models and implement it as a portable self-gating layer. In the self-gating layer, the weights of feature interactions are revisited through an attention network with different attentive aspects, and the gate status is then dynamically determined from the attention score to achieve automatic selection. Our method can be easily ported to FM and DeepFM. Experimental results on two real-world datasets show that our proposed methods are superior to many state-of-the-art methods.
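
The abstract only outlines the mechanism, so the following is a minimal sketch of how a self-gating layer on top of an FM second-order term might look: an attention network scores each pairwise feature interaction, and a gate derived from that score decides whether the interaction contributes to the prediction. The layer sizes, the gating threshold, the straight-through gate, and all names here are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of a self-gated FM layer (PyTorch), assuming:
# - categorical fields mapped to dense embeddings,
# - an attention MLP scoring each pairwise interaction vector,
# - a hard gate on the score with a straight-through estimator so the
#   attention network still receives gradients.
import torch
import torch.nn as nn


class SelfGatedFM(nn.Module):
    def __init__(self, num_features, embed_dim=16, attn_dim=32, gate_threshold=0.5):
        super().__init__()
        self.embedding = nn.Embedding(num_features, embed_dim)
        self.linear = nn.Embedding(num_features, 1)   # first-order FM term
        self.bias = nn.Parameter(torch.zeros(1))
        # Attention network: maps each interaction vector to a score in (0, 1).
        self.attention = nn.Sequential(
            nn.Linear(embed_dim, attn_dim),
            nn.ReLU(),
            nn.Linear(attn_dim, 1),
            nn.Sigmoid(),
        )
        self.gate_threshold = gate_threshold

    def forward(self, x):
        # x: (batch, num_fields) of feature indices
        emb = self.embedding(x)                        # (B, F, D)
        f = emb.size(1)
        idx_i, idx_j = torch.triu_indices(f, f, offset=1)
        inter = emb[:, idx_i] * emb[:, idx_j]          # (B, P, D) pairwise interactions
        score = self.attention(inter)                  # (B, P, 1) attention scores
        # Gate status from the attention score; straight-through trick keeps gradients.
        hard = (score > self.gate_threshold).float()
        gate = hard + score - score.detach()
        second_order = (gate * inter).sum(dim=(1, 2))  # gated FM second-order term
        first_order = self.linear(x).sum(dim=(1, 2))
        return torch.sigmoid(self.bias + first_order + second_order)


if __name__ == "__main__":
    model = SelfGatedFM(num_features=1000)
    x = torch.randint(0, 1000, (4, 8))  # 4 samples, 8 categorical fields
    print(model(x).shape)               # torch.Size([4])
```

Because the gating happens entirely inside this layer, the same idea can be dropped into a DeepFM-style model by gating the interaction vectors before they feed both the FM term and the deep component, which is consistent with the portability the abstract claims.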