Selected article for: "input layer and LSTM layer"

Author: Liu, Yuliang; Zhang, Quan; Zhao, Geng; Liu, Guohua; Liu, Zhiang
Title: Deep Learning-Based Method of Diagnosing Hyperlipidemia and Providing Diagnostic Markers Automatically
  • Document date: 2020_3_11
  • ID: 1r4gm2d4_14
    Snippet: where Attention is the attention vector, F is the encoding function of the original data, x is the raw data, and I is the input data of the LSTM layer. The possibility that the target output is related to each input physiological parameter is obtained by the coding process F. Then, the output of the coding process is normalized by Softmax to obtain attention distribution probability values that conform to the probability distribution value range. The use of att.....
    Document: where Attention is the attention vector, F is the encoding function of the original data, x is the raw data, and I is the input data of the LSTM layer. The possibility that the target output is related to each input physiological parameter is obtained by the coding process F. Then, the output of the coding process is normalized by Softmax to obtain attention distribution probability values that conform to the probability distribution value range. The use of the attention mechanism provides more information on which physiological parameters are more important for the diagnosis of the target disease. At the same time, the attention mechanism helps the model process effective information and discard useless data, improving the model's ability to handle more complex information. The Softmax function is shown in Equation 5.
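
    The snippet describes an attention step in which an encoding function F scores each physiological parameter, Softmax (Equation 5) normalizes the scores into an attention distribution, and the weighted result I is fed to the LSTM layer. A minimal sketch of that pattern is given below; it assumes a linear layer as F, elementwise weighting of the raw input, and illustrative layer sizes, since the paper's exact encoder and combination rule are not shown in the excerpt.

    ```python
    import torch
    import torch.nn as nn


    class AttentionLSTM(nn.Module):
        """Sketch of attention-weighted input to an LSTM: F scores each
        physiological parameter, Softmax yields the attention distribution,
        and the weighted input I is passed to the LSTM."""

        def __init__(self, num_params: int, hidden_size: int = 32):
            super().__init__()
            # F: encoding function of the raw data x (a linear layer is an assumption).
            self.encode = nn.Linear(num_params, num_params)
            self.lstm = nn.LSTM(input_size=num_params, hidden_size=hidden_size,
                                batch_first=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, num_params) raw physiological parameters.
            scores = self.encode(x)                    # F(x)
            attention = torch.softmax(scores, dim=-1)  # Softmax normalization (Equation 5)
            weighted_input = attention * x             # I: attention-weighted LSTM input
            output, _ = self.lstm(weighted_input)
            return output


    if __name__ == "__main__":
        model = AttentionLSTM(num_params=8)
        x = torch.randn(4, 10, 8)  # 4 patients, 10 time steps, 8 parameters
        print(model(x).shape)      # torch.Size([4, 10, 32])
    ```

    The attention weights can also be read off per parameter, which matches the paper's point that the mechanism indicates which physiological parameters matter most for the diagnosis.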

    Search related documents:
    Co phrase search for related documents
    • attention mechanism and target output: 1
    • complex information and target disease: 1
    • complex information and target disease diagnosis: 1
    • effective information and target disease: 1