Selected article for: "common distribution and structure prediction"

Author: Hu, Fenyu; Wang, Liping; Wu, Shu; Wang, Liang; Tan, Tieniu
Title: Graph Classification by Mixture of Diverse Experts
  • Cord-id: 6z0zf9es
  • Document date: 2021-03-29
  • ID: 6z0zf9es
    Snippet: Graph classification is a challenging research problem in many applications across a broad range of domains. In these applications, it is very common that the class distribution is imbalanced. Recently, Graph Neural Network (GNN) models have achieved superior performance on various real-world datasets. Despite their success, most current GNN models largely overlook the important setting of imbalanced class distribution, which typically results in prediction bias towards majority classes. To alleviate [...]
    Document: Graph classification is a challenging research problem in many applications across a broad range of domains. In these applications, it is very common that the class distribution is imbalanced. Recently, Graph Neural Network (GNN) models have achieved superior performance on various real-world datasets. Despite their success, most current GNN models largely overlook the important setting of imbalanced class distribution, which typically results in prediction bias towards majority classes. To alleviate the prediction bias, we propose to leverage the semantic structure of the dataset based on the distribution of node embeddings. Specifically, we present GraphDIVE, a general framework leveraging a mixture of diverse experts (i.e., graph classifiers) for imbalanced graph classification. With a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets. Then each expert network is trained on its corresponding subset. Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
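
    The abstract's divide-and-conquer idea — a gating network soft-assigning each graph to experts, with the final prediction mixed from expert outputs — can be sketched as follows. This is a minimal, hypothetical illustration of a generic mixture-of-experts classifier, not the paper's actual GraphDIVE implementation; all names, shapes, and the linear gate/expert parameterization are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(x, axis=-1):
        # numerically stable softmax
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    class MixtureOfExperts:
        """Illustrative mixture of experts over graph-level embeddings.

        A gating network produces a soft partition of the dataset
        (weights over experts); each expert is a classifier, and the
        prediction is the gate-weighted mixture of expert outputs.
        """

        def __init__(self, dim, n_experts, n_classes):
            # gating network: linear layer mapping embedding -> expert weights
            self.W_gate = rng.normal(size=(dim, n_experts))
            # experts: one linear classifier per expert
            self.W_experts = rng.normal(size=(n_experts, dim, n_classes))

        def predict_proba(self, h):
            # h: (batch, dim) graph embeddings from some upstream GNN
            gates = softmax(h @ self.W_gate)                       # (batch, n_experts)
            logits = np.einsum('bd,edc->bec', h, self.W_experts)   # (batch, n_experts, n_classes)
            probs = softmax(logits, axis=-1)                       # per-expert class probabilities
            return np.einsum('be,bec->bc', gates, probs)           # gate-weighted mixture

    moe = MixtureOfExperts(dim=8, n_experts=3, n_classes=2)
    h = rng.normal(size=(4, 8))
    p = moe.predict_proba(h)   # (4, 2); each row sums to 1
    ```

    In the imbalanced setting described above, the intuition is that the gate can route minority-class graphs to dedicated experts, so majority classes do not dominate a single shared classifier.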

    Search related documents:
    Co-phrase search for related documents
    • accurate prediction and loss function: 1, 2
    • accurate prediction and low standard deviation: 1