Author: Guillaume Chassagnon; Maria Vakalopoulou; Enzo Battistella; Stergios Christodoulidis; Trieu-Nghi Hoang-Thi; Severine Dangeard; Eric Deutsch; Fabrice Andre; Enora Guillo; Nara Halm; Stefany El Hajj; Florian Bompard; Sophie Neveu; Chahinez Hani; Ines Saab; Alienor Campredon; Hasmik Koulakian; Souhail Bennani; Gael Freche; Aurelien Lombard; Laure Fournier; Hippolyte Monnier; Teodor Grand; Jules Gregory; Antoine Khalil; Elyas Mahdjoub; Pierre-Yves Brillet; Stephane Tran Ba; Valerie Bousson; Marie-Pierre Revel; Nikos Paragios
Title: AI-Driven CT-based quantification, staging and short-term outcome prediction of COVID-19 pneumonia
Document date: 2020-04-22
ID: nxm1jr0x_22
Snippet: Regarding implementation details, six templates were used for the AtlasNet framework, together with normalized cross-correlation and mutual information as similarity metrics. The networks were trained with a weighted cross-entropy loss, whose weights depend on the occurrence of each class, combined with a Dice loss. Moreover, the 3D network was trained with a Dice loss alone. The Dice loss (DL) and weighted cross entropy (WCE) are defined as follows, .....
Document: Regarding implementation details, six templates were used for the AtlasNet framework, together with normalized cross-correlation and mutual information as similarity metrics. The networks were trained with a weighted cross-entropy loss, whose weights depend on the occurrence of each class, combined with a Dice loss. Moreover, the 3D network was trained with a Dice loss alone. The Dice loss (DL) and weighted cross entropy (WCE) are defined as follows,
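The two losses named in the excerpt can be sketched as follows. This is a minimal NumPy illustration of the standard soft Dice loss and binary weighted cross-entropy formulations, not the paper's exact equations (which are truncated here); the per-class weight layout (`weights[0]` for background, `weights[1]` for foreground) is an assumption for the binary case.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-7):
    """Soft Dice loss: 1 - 2|P∩T| / (|P| + |T|).

    pred and target are arrays of probabilities/labels in [0, 1].
    eps avoids division by zero on empty masks.
    """
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def weighted_cross_entropy(pred, target, weights, eps=1e-7):
    """Binary weighted cross-entropy.

    weights = (w_background, w_foreground); in practice the weights are
    often set inversely proportional to how frequently each class occurs.
    """
    pred = np.clip(pred, eps, 1.0 - eps)  # keep log() finite
    return -np.mean(weights[1] * target * np.log(pred)
                    + weights[0] * (1.0 - target) * np.log(1.0 - pred))
```

A perfect prediction drives the Dice loss to 0, while upweighting a rare foreground class makes the WCE penalize missed foreground pixels more heavily, which is the usual motivation for combining the two terms.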
Search for related documents:
Co-phrase search for related documents:
- AtlasNet framework and dice loss: 1
- AtlasNet framework and normalized cross correlation: 1
- cross correlation and dice loss: 1
- cross correlation and mutual information: 1
- cross correlation and mutual information normalized cross correlation: 1
- cross correlation and normalized cross correlation: 1, 2, 3
- weighted cross entropy loss and entropy loss: 1, 2
- weighted cross entropy loss and weighted entropy loss: 1, 2
- dice loss and entropy loss: 1
- dice loss and normalized cross correlation: 1
- entropy loss and weighted entropy loss: 1, 2
- mutual information and normalized cross correlation: 1
- mutual information normalized cross correlation and normalized cross correlation: 1