Selected article for: "attention layer and automatic classification"

Author: Özdemir, Özgür; Sönmez, Elena Battini
Title: Attention Mechanism and Mixup Data Augmentation for Classification of COVID-19 Computed Tomography Images
  • Cord-id: 8yeq228d
  • Document date: 2021-07-15
  • ID: 8yeq228d
    Snippet: The Coronavirus disease is quickly spreading all over the world and the emergency situation is still out of control. Latest achievements of deep learning algorithms suggest the use of deep Convolutional Neural Network to implement a computer-aided diagnostic system for automatic classification of COVID-19 CT images. In this paper, we propose to employ a feature-wise attention layer in order to enhance the discriminative features obtained by convolutional networks. …
    Document: The Coronavirus disease is quickly spreading all over the world and the emergency situation is still out of control. Latest achievements of deep learning algorithms suggest the use of deep Convolutional Neural Network to implement a computer-aided diagnostic system for automatic classification of COVID-19 CT images. In this paper, we propose to employ a feature-wise attention layer in order to enhance the discriminative features obtained by convolutional networks. Moreover, the original performance of the network has been improved using the mixup data augmentation technique. This work compares the proposed attention-based model against the stacked attention networks, and traditional versus mixup data augmentation approaches. We deduced that the feature-wise attention extension, while outperforming the stacked attention variants, achieves remarkable improvements over the baseline convolutional neural networks. That is, the ResNet50 architecture extended with a feature-wise attention layer obtained a 95.57% accuracy score, which, to the best of our knowledge, sets the state of the art on the challenging COVID-CT dataset.
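The two techniques the abstract names are both simple to sketch. A minimal, framework-agnostic illustration (not the authors' implementation; `mixup`, `sample_lam`, and `feature_wise_attention` are hypothetical names, and the attention sketch assumes a squeeze-and-excitation-style channel gate, which is one common form of feature-wise attention):

```python
import numpy as np

def sample_lam(alpha=0.2, rng=None):
    """Draw the mixing coefficient lambda from Beta(alpha, alpha),
    as the mixup technique prescribes. alpha=0.2 is an assumed value."""
    rng = rng or np.random.default_rng()
    return rng.beta(alpha, alpha)

def mixup(x1, y1, x2, y2, lam):
    """Mixup: convex combination of two inputs and their one-hot labels."""
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y

def feature_wise_attention(feat, w1, b1, w2, b2):
    """Channel-wise attention gate over a (C, H, W) feature map:
    global-average-pool -> small MLP -> sigmoid -> rescale each channel.
    One plausible reading of a 'feature-wise attention layer'."""
    s = feat.mean(axis=(1, 2))               # squeeze: (C,)
    h = np.maximum(w1 @ s + b1, 0.0)         # ReLU bottleneck
    g = 1.0 / (1.0 + np.exp(-(w2 @ h + b2))) # gate in (0, 1), shape (C,)
    return feat * g[:, None, None]           # reweight channels

# Usage on toy "CT slices": mix two 4x4 images with a fixed lambda.
x1, x2 = np.ones((4, 4)), np.zeros((4, 4))
y1, y2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # one-hot: COVID / non-COVID
x_mix, y_mix = mixup(x1, y1, x2, y2, lam=0.25)
```

In practice the gate's weights (`w1`, `b1`, `w2`, `b2`) would be learned jointly with the backbone; mixup is applied per minibatch with a freshly sampled lambda.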

    Search related documents:
    Co phrase search for related documents
    • accuracy auc f1 and long short: 1, 2
    • accuracy auc f1 and long short term memory: 1, 2
    • accuracy performance and adam optimizer: 1, 2
    • accuracy performance and long short: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
    • accuracy performance and long short term memory: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
    • accuracy performance and low contrast: 1
    • accuracy score and adam optimizer: 1, 2, 3, 4
    • accuracy score and long short: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
    • accuracy score and long short term memory: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11
    • accuracy score and low contrast: 1, 2
    • accurate early and long short: 1, 2, 3, 4
    • accurate early and long short term memory: 1, 2, 3
    • adam optimizer and long short: 1, 2, 3
    • adam optimizer and long short term memory: 1, 2, 3