Selected article for: "long short term memory and lstm layer"

Author: Xiang Bai; Cong Fang; Yu Zhou; Song Bai; Zaiyi Liu; Qianlan Chen; Yongchao Xu; Tian Xia; Shi Gong; Xudong Xie; Dejia Song; Ronghui Du; Chunhua Zhou; Chengyang Chen; Dianer Nie; Dandan Tu; Changzheng Zhang; Xiaowu Liu; Lixin Qin; Weiwei Chen
Title: Predicting COVID-19 malignant progression with AI techniques
  • Document date: 2020_3_23
  • ID: 50oy9qqy_9
    Snippet: The copyright holder for this preprint (which was not peer-reviewed) is the . https://doi.org/10.1101/2020.03.20.20037325 doi: medRxiv preprint matrix into an 18-dimensional vector and concatenated it with the 22-dimensional vector to form a 40-dimensional CT feature vector. According to the checkpoints, the CT data sequence with a length of seven and a dimension of 40 was formed. For the sake of combining static and dynamic data as the input of .....
    Document: ...matrix into an 18-dimensional vector and concatenated it with the 22-dimensional vector to form a 40-dimensional CT feature vector. According to the checkpoints, a CT data sequence of length seven and dimension 40 was formed. To combine static and dynamic data as the input of a long short-term memory (LSTM) network, a multi-layer perceptron (MLP) was applied to the static data to obtain a 40-dimensional feature vector, which was used as the input at the first timestamp of the LSTM, followed by the seven CT feature vectors [18]. The LSTM model employed in this study is a single-layer network with an embedding dimension of 40 and a hidden dimension of 32. The output of the LSTM, a 32 × 8 feature sequence, was then fed into fully connected layers. A Softmax layer was added at the top of the network to output the probability of the patient's conversion to the severe/critical stage.
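
    The architecture described above can be sketched as follows. This is a minimal PyTorch sketch, not the authors' implementation: the static-feature dimension (`static_dim`), the MLP hidden width, and the fully connected head are assumptions not specified in the snippet; only the 40-dimensional inputs, the 8-step sequence (1 MLP vector + 7 CT vectors), the single-layer LSTM with hidden dimension 32, and the Softmax output follow the text.

    ```python
    import torch
    import torch.nn as nn

    class SeverityLSTM(nn.Module):
        """Sketch of the described model: static data -> MLP -> first LSTM
        timestep, followed by seven 40-dim CT feature vectors."""

        def __init__(self, static_dim=61, feat_dim=40, hidden_dim=32, num_classes=2):
            # static_dim and the MLP hidden width (64) are hypothetical.
            super().__init__()
            # MLP maps static (clinical) data to a 40-dim vector used as
            # the input of the first LSTM timestamp.
            self.mlp = nn.Sequential(
                nn.Linear(static_dim, 64),
                nn.ReLU(),
                nn.Linear(64, feat_dim),
            )
            # Single-layer LSTM: embedding (input) dim 40, hidden dim 32.
            self.lstm = nn.LSTM(input_size=feat_dim, hidden_size=hidden_dim,
                                batch_first=True)
            # Fully connected head over the flattened 32 x 8 output sequence.
            self.fc = nn.Linear(hidden_dim * 8, num_classes)

        def forward(self, static_x, ct_seq):
            # static_x: (B, static_dim); ct_seq: (B, 7, 40)
            first = self.mlp(static_x).unsqueeze(1)   # (B, 1, 40)
            seq = torch.cat([first, ct_seq], dim=1)   # (B, 8, 40)
            out, _ = self.lstm(seq)                   # (B, 8, 32)
            logits = self.fc(out.flatten(1))          # (B, num_classes)
            # Softmax gives the probability of conversion to severe/critical.
            return torch.softmax(logits, dim=-1)
    ```

    A forward pass takes a batch of static vectors and a batch of seven-step CT feature sequences and returns a per-class probability distribution.
    
    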

    Search related documents:
    Co phrase search for related documents
    • feature sequence and LSTM model: 1
    • feature vector and long short term memory LSTM: 1, 2
    • feature vector and LSTM model: 1
    • hidden dimension and long short term memory LSTM: 1
    • long short term memory LSTM and LSTM model: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25
    • long short term memory LSTM and LSTM output: 1, 2, 3
    • long short term memory LSTM input and LSTM model: 1