Selected article for: "kappa value and moderate agreement"

Author: Bagnera, Silvia; Bisanti, Francesca; Tibaldi, Claudia; Pasquino, Massimo; Berrino, Giulia; Ferraro, Roberta; Patania, Sebastiano
Title: Performance of Radiologists in the Evaluation of the Chest Radiography with the Use of a “new software score” in Coronavirus Disease 2019 Pneumonia Suspected Patients
  • Cord-id: t85wd0xq
  • Document date: 2020_7_20
  • ID: t85wd0xq
    Snippet: OBJECTIVES: The purpose of this study is to assess the performance of radiologists using a new software called “COVID-19 score” when performing chest radiography on patients potentially infected by coronavirus disease 2019 (COVID-19) pneumonia. Chest radiography (or chest X-ray, CXR) and CT are important for the imaging diagnosis of coronavirus pneumonia (COVID-19). Mobile CXR devices are efficient during epidemics, because they reduce the risk of contagion and are easy to sanitize.
    Document: OBJECTIVES: The purpose of this study is to assess the performance of radiologists using a new software called “COVID-19 score” when performing chest radiography on patients potentially infected by coronavirus disease 2019 (COVID-19) pneumonia. Chest radiography (or chest X-ray, CXR) and CT are important for the imaging diagnosis of coronavirus pneumonia (COVID-19). Mobile CXR devices are efficient during epidemics, because they reduce the risk of contagion and are easy to sanitize. MATERIAL AND METHODS: From February–April 2020, 14 radiologists retrospectively evaluated a pool of 312 chest X-ray exams to test a new software function for lung imaging analysis based on radiological features and graded on a three-point scale. This tool automatically generates a cumulative score (0–18). The inter-rater agreement (evaluated with Fleiss’ method) and the average time for the compilation of the banner were calculated. RESULTS: Fourteen radiologists evaluated 312 chest radiographs of COVID-19 pneumonia suspected patients (80 males and 38 females) with an average age of 64.47 years. The inter-rater agreement showed a Fleiss’ kappa value of 0.53, and the intra-group agreement ranged between Fleiss’ kappa values of 0.49 and 0.59, indicating moderate agreement (with 0.4–0.6 considered “moderate”). Years of work experience were irrelevant. The average time for obtaining the result with the automatic software was between 7 s (e.g., a COVID-19 score of zero) and 21 s (e.g., a COVID-19 score from 6 to 12). CONCLUSION: The use of automatic software for the generation of a CXR “COVID-19 score” has proven to be simple, fast, and replicable. By weighting the scores by the number of pathological lung areas, this tool could provide a useful parameter for clinical monitoring.
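    The agreement figures in the abstract (0.53 overall, 0.49–0.59 within groups) are Fleiss' kappa values. As a minimal sketch (not the authors' code), the statistic can be computed from an N-subjects × k-categories matrix, where each cell holds how many raters assigned that subject to that category:

    ```python
    def fleiss_kappa(counts):
        """Fleiss' kappa for a list of rows, one per subject, each row
        holding the number of raters who assigned that subject to each
        category. Every subject must be rated by the same number of raters."""
        N = len(counts)                      # number of subjects
        n = sum(counts[0])                   # raters per subject
        k = len(counts[0])                   # number of categories
        # Observed agreement: P_i = (sum_j n_ij^2 - n) / (n * (n - 1))
        P_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                    for row in counts) / N
        # Chance agreement: P_e = sum_j p_j^2, with p_j the column share
        p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
        P_e = sum(pj * pj for pj in p)
        return (P_bar - P_e) / (1 - P_e)

    # 3 raters, 2 subjects, 2 categories, perfect agreement
    print(fleiss_kappa([[3, 0], [0, 3]]))    # -> 1.0
    ```

    On the conventional Landis–Koch scale, values between 0.41 and 0.60 are labeled "moderate" agreement, which is the band the paper's 0.53 falls into.
    
    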

    Search related documents:
    Co-phrase search for related documents:
    • accurate rapid and low specificity: 1, 2, 3, 4, 5, 6
    • accurate rapid and lung consolidation: 1, 2, 3, 4
    • accurate rapid diagnosis and low medium: 1
    • accurate rapid diagnosis and low specificity: 1
    • accurate rapid diagnosis and lung consolidation: 1
    • accurate rapid diagnosis critical and lung consolidation: 1
    • lobar pneumonia and lung consolidation: 1
    • low specificity and lung consolidation: 1