Selected article for: "linear kernel function and random forest"

Author: Guillaume Chassagnon; Maria Vakalopoulou; Enzo Battistella; Stergios Christodoulidis; Trieu-Nghi Hoang-Thi; Severine Dangeard; Eric Deutsch; Fabrice Andre; Enora Guillo; Nara Halm; Stefany El Hajj; Florian Bompard; Sophie Neveu; Chahinez Hani; Ines Saab; Alienor Campredon; Hasmik Koulakian; Souhail Bennani; Gael Freche; Aurelien Lombard; Laure Fournier; Hippolyte Monnier; Teodor Grand; Jules Gregory; Antoine Khalil; Elyas Mahdjoub; Pierre-Yves Brillet; Stephane Tran Ba; Valerie Bousson; Marie-Pierre Revel; Nikos Paragios
Title: AI-Driven CT-based quantification, staging and short-term outcome prediction of COVID-19 pneumonia
  • Document date: 2020_4_22
  • ID: nxm1jr0x_28
    Snippet: Subsequently, this reduced feature space was considered the most appropriate for training, and the following 7 classification methods with acceptable performance (> 60% balanced accuracy) and coherent performance between training and validation (balanced-accuracy decrease < 20%) were trained and combined through a winner-takes-all approach to determine the optimal o.....
    Document: Subsequently, this reduced feature space was considered the most appropriate for training, and the following 7 classification methods with acceptable performance (> 60% balanced accuracy) and coherent performance between training and validation (balanced-accuracy decrease < 20%) were trained and combined through a winner-takes-all approach to determine the optimal outcome (Table 4). The final selected methods were Support Vector Machines with linear, polynomial, and radial basis function kernels, Decision Trees, Random Forests, AdaBoost, and Gaussian Naive Bayes. To overcome the imbalance between the different classes, each class received a weight inversely proportional to its size. The Support Vector Machines were all three granted a polynomial kernel function of degree 3 and a penalty parameter of 0.25; in addition, the one with a radial basis function kernel was granted a kernel coefficient of 3. The decision tree classifier was limited to a depth of 3 to avoid overfitting. The random forest classifier was composed of 8 such trees. The AdaBoost classifier was based on a decision tree of maximal depth 2, boosted three times.
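    The two ensemble ideas in the passage above — class weights inversely proportional to class size, and a winner-takes-all combination that keeps the prediction of the best-validated model — can be sketched in plain Python. This is a minimal illustration, not the authors' code: the weight normalization (total / (n_classes × count)) and the use of balanced accuracy as the selection criterion are assumptions consistent with, but not fully specified by, the text.

    ```python
    from collections import Counter

    def inverse_size_class_weights(labels):
        """Weight each class inversely proportional to its size.
        The normalization total / (n_classes * count) is an assumption;
        the paper only states 'inversely proportional'."""
        counts = Counter(labels)
        total = len(labels)
        k = len(counts)
        return {c: total / (k * n) for c, n in counts.items()}

    def balanced_accuracy(y_true, y_pred):
        """Mean of per-class recalls."""
        recalls = []
        for c in set(y_true):
            idx = [i for i, y in enumerate(y_true) if y == c]
            correct = sum(1 for i in idx if y_pred[i] == c)
            recalls.append(correct / len(idx))
        return sum(recalls) / len(recalls)

    def winner_takes_all(models, X_val, y_val, x_new):
        """Keep only the model with the best validation balanced
        accuracy and return its prediction for a new sample.
        `models` maps name -> object with a .predict(list) method."""
        best = max(models.values(),
                   key=lambda m: balanced_accuracy(y_val, m.predict(X_val)))
        return best.predict([x_new])[0]
    ```

    With weights computed this way, every class contributes the same total weight to the loss regardless of its size, which is the usual motivation for this scheme on imbalanced cohorts.
    
    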

    Search related documents:
    Co-phrase search for related documents
    • balanced accuracy and classification method: 1
    • balanced accuracy and decision tree: 1
    • balanced accuracy and forest classifier: 1
    • classification method and decision tree: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
    • classification method and feature space: 1
    • classification method and forest classifier: 1, 2
    • classification method and kernel function: 1
    • decision tree and feature space: 1, 2
    • decision tree and forest classifier: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
    • decision tree classifier and forest classifier: 1, 2, 3, 4, 5, 6, 7, 8, 9
    • forest classifier and kernel function: 1