Selected article for: "hidden layer and neural network"

Author: Elgharabawy, Ayman; Prasad, Mukesh; Lin, Chin-Teng
Title: Subgroup Preference Neural Network
  • Cord-id: h4hjx6hh
  • Document date: 2021_9_12
  • ID: h4hjx6hh
    Snippet: Subgroup label ranking aims to rank groups of labels using a single ranking model and is a new problem in preference learning. This paper introduces the Subgroup Preference Neural Network (SGPNN), which combines multiple networks with different activation functions, learning rates, and output layers into one artificial neural network (ANN) to discover the hidden relation between the subgroups’ multi-labels. The SGPNN is a feedforward (FF), partially connected network that has a single middle layer
    Document: Subgroup label ranking aims to rank groups of labels using a single ranking model and is a new problem in preference learning. This paper introduces the Subgroup Preference Neural Network (SGPNN), which combines multiple networks with different activation functions, learning rates, and output layers into one artificial neural network (ANN) to discover the hidden relation between the subgroups’ multi-labels. The SGPNN is a feedforward (FF), partially connected network that has a single middle layer and uses a stairstep (SS) multi-valued activation function to enhance the prediction probability and accelerate the ranking convergence. The novel structure of the proposed SGPNN consists of a multi-activation function neuron (MAFN) in the middle layer to rank each subgroup independently. The SGPNN uses gradient ascent to maximize the Spearman ranking correlation between the groups of labels. Each label is represented by an output neuron that has a single SS function. The proposed SGPNN, trained on a conjoint dataset, outperforms the other label ranking methods, which use each dataset individually. The proposed SGPNN achieves an average accuracy of 91.4% on the conjoint dataset, compared to supervised clustering, decision tree, multilayer perceptron label ranking, and label ranking forests, which achieve average accuracies of 60%, 84.8%, 69.2%, and 73%, respectively, on the individual datasets.
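    To make the two ingredients named in the abstract concrete, the sketch below shows a simple multi-valued stairstep activation and the Spearman rank correlation that the gradient ascent maximizes. This is an illustrative assumption, not the paper's implementation: the exact SS step placement used by SGPNN is not given in this record, so a uniform quantizer over [0, 1) stands in for it, and the helper names (`stairstep`, `spearman`) are hypothetical.

    ```python
    import math

    def stairstep(x, levels=4):
        # Illustrative multi-valued stairstep activation (assumed form):
        # quantize a real input in [0, 1) into discrete rank levels 0..levels-1.
        return min(max(int(math.floor(x * levels)), 0), levels - 1)

    def spearman(a, b):
        # Spearman rank correlation: rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
        # where d_i is the difference between the ranks of item i in a and b.
        # Assumes no ties, as in a strict label ranking.
        n = len(a)
        def ranks(values):
            order = sorted(range(n), key=lambda i: values[i])
            r = [0] * n
            for rank, i in enumerate(order):
                r[i] = rank
            return r
        ra, rb = ranks(a), ranks(b)
        d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
        return 1 - 6 * d2 / (n * (n * n - 1))

    # A perfectly agreeing ranking scores 1.0; a fully reversed one scores -1.0.
    print(stairstep(0.6, levels=4))        # -> 2
    print(spearman([1, 2, 3, 4], [1, 2, 3, 4]))  # -> 1.0
    print(spearman([1, 2, 3, 4], [4, 3, 2, 1]))  # -> -1.0
    ```

    In this reading, each output neuron would emit one stairstep level per label, and training would push the network's predicted ranking toward a higher `spearman` score against the ground-truth ranking of each subgroup.
    
    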

    Search related documents:
    Co phrase search for related documents
    • accuracy enhance and machine learning: 1, 2, 3, 4, 5, 6, 7, 8, 9
    • activation function and machine learning: 1, 2, 3, 4, 5, 6
    • local approach and machine learning: 1, 2, 3, 4, 5, 6, 7, 8, 9
    • low squared error and machine learning: 1
    • lr learning and machine learning: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21
    • lr learning rate and machine learning: 1