Selected article for: "art state and prediction performance"

Author: Xie, Zhifeng; Zhang, Wenling; Ding, Huiming; Ma, Lizhuang
Title: MsFcNET: Multi-scale Feature-Crossing Attention Network for Multi-field Sparse Data
  • Cord-id: lt2nxbzm
  • Document date: 2020-04-17
    Document: Feature engineering usually needs to extract dense-and-implicit cross features from multi-field sparse data. Recently, many state-of-the-art models have been proposed to capture low-order and high-order feature interactions. However, most of them ignore the importance of cross features and fail to suppress the negative impact of useless features. In this paper, a novel multi-scale feature-crossing attention network (MsFcNET) is proposed to extract dense-and-implicit cross features and learn their importance at different scales. The model adopts DIA-LSTM units to construct a new attention calibration architecture, which can adaptively adjust the weights of features during feature interactions. It also integrates a multi-scale feature-crossing module to strengthen the representation ability of cross features from multi-field sparse data. Extensive experimental results on three real-world prediction datasets demonstrate that the proposed model yields superior performance compared with other state-of-the-art models.
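
    The abstract does not specify the architecture in code, but the calibration idea it describes (a recurrent unit that re-weights features while feature interactions are formed) can be sketched. The PyTorch snippet below is an illustrative assumption, not the authors' implementation: the class name CalibratedFeatureCrossing, the factorization-machine-style pairwise crossing, and the sigmoid gate are all stand-ins, and the multi-scale module is omitted entirely.

    # A minimal sketch (not the authors' code) of the attention-calibration
    # idea from the abstract: a recurrent unit shared across interaction
    # steps re-weights features as second-order crosses are formed.
    import torch
    import torch.nn as nn

    class CalibratedFeatureCrossing(nn.Module):
        """Hypothetical layer: pairwise feature crossing with LSTM-based
        recalibration, loosely echoing the described DIA-LSTM mechanism."""

        def __init__(self, embed_dim: int, num_steps: int = 2):
            super().__init__()
            self.num_steps = num_steps
            # One LSTM cell reused at every interaction step.
            self.calib = nn.LSTMCell(input_size=embed_dim, hidden_size=embed_dim)
            self.proj = nn.Linear(embed_dim, embed_dim)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, num_fields, embed_dim) -- embedded sparse fields
            b, _, d = x.shape
            h = x.new_zeros(b, d)
            c = x.new_zeros(b, d)
            out = x
            for _ in range(self.num_steps):
                # Sum over all pairwise element-wise products via the
                # sum-of-squares identity (as in factorization machines).
                s = out.sum(dim=1)                                  # (b, d)
                cross = 0.5 * (s * s - (out * out).sum(dim=1))      # (b, d)
                # The shared cell turns the pooled cross signal into
                # per-dimension gates in (0, 1), suppressing weak features.
                h, c = self.calib(cross, (h, c))
                gate = torch.sigmoid(self.proj(h)).unsqueeze(1)     # (b, 1, d)
                out = out * gate
            return out.flatten(1)  # (b, num_fields * embed_dim)

    # Usage: 4 samples, 10 fields, 16-dim embeddings -> (4, 160) output.
    emb = torch.randn(4, 10, 16)
    print(CalibratedFeatureCrossing(embed_dim=16)(emb).shape)

    The gating here deliberately mirrors the abstract's claim that the model "adaptively adjusts the weights of features in the process of feature interactions"; reusing one cell across steps is what makes the calibration recurrent rather than per-layer.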

    Search related documents:
    Co-phrase search for related documents
    • activation function and machine learning: 1, 2, 3, 4, 5, 6