Selected article for: "adam optimize and loss function"

Author: Liu, Yuliang; Zhang, Quan; Zhao, Geng; Liu, Guohua; Liu, Zhiang
Title: Deep Learning-Based Method of Diagnosing Hyperlipidemia and Providing Diagnostic Markers Automatically
  • Document date: 2020_3_11
  • ID: 1r4gm2d4_31
    Snippet: During the training process, the size of the mini-batch was 20, the loss function was the cross-entropy cost function, and the Adam algorithm was used to optimize the global parameters (ε=0.001, p₁=0.9, p₂=0.999, δ=10⁻⁸). At the same time, one-hot technology was also applied to the representation of the data labels. Each dimension of the output vector represents a different health condition; only the corresponding element is 1 and the rest are 0. Becau.....
    Document: During the training process, the size of the mini-batch was 20, the loss function was the cross-entropy cost function, and the Adam algorithm was used to optimize the global parameters (ε=0.001, p₁=0.9, p₂=0.999, δ=10⁻⁸). At the same time, one-hot technology was also applied to the representation of the data labels. Each dimension of the output vector represents a different health condition; only the corresponding element is 1 and the rest are 0. Because this paper distinguished two kinds of health conditions, a two-dimensional vector was used to code the data label: the normal diagnosis result was coded as (1, 0) and the hyperlipidemia diagnosis result as (0, 1). One-hot technology helps improve the robustness of the model. At the same time, the sigmoid function was used as the classification function because of the binary classification task. As mentioned above, cross-entropy was used as the loss function; the principle of cross-entropy is shown in Equation 8.
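
    Illustrative sketch: the training configuration described above (mini-batches of 20, Adam with the reported hyperparameters, a two-unit sigmoid output over one-hot labels, and a cross-entropy loss) can be sketched as follows in PyTorch. The network architecture, the input dimension, and the synthetic data are assumptions introduced only so the example runs; they are not taken from the paper.

    # Minimal sketch of the reported training setup (assumed PyTorch implementation).
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    N_FEATURES = 16          # assumed input dimension; not stated in the snippet

    model = nn.Sequential(   # placeholder architecture, not the authors' network
        nn.Linear(N_FEATURES, 32),
        nn.ReLU(),
        nn.Linear(32, 2),
        nn.Sigmoid(),        # sigmoid classification function over two output units
    )

    # Adam with the reported hyperparameters: step size 0.001,
    # decay rates 0.9 / 0.999, numerical-stability constant 10^-8.
    optimizer = torch.optim.Adam(model.parameters(), lr=0.001,
                                 betas=(0.9, 0.999), eps=1e-8)

    # Cross-entropy between the sigmoid outputs and the one-hot labels:
    # (1, 0) = normal, (0, 1) = hyperlipidemia.
    def cross_entropy(pred, target, eps=1e-12):
        return -(target * torch.log(pred + eps)
                 + (1 - target) * torch.log(1 - pred + eps)).sum(dim=1).mean()

    # Synthetic data, only to make the sketch runnable.
    x = torch.randn(200, N_FEATURES)
    y = torch.nn.functional.one_hot(torch.randint(0, 2, (200,)), num_classes=2).float()
    loader = DataLoader(TensorDataset(x, y), batch_size=20, shuffle=True)  # mini-batch size 20

    for xb, yb in loader:
        optimizer.zero_grad()
        loss = cross_entropy(model(xb), yb)
        loss.backward()
        optimizer.step()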

    Search related documents:
    Co-phrase search for related documents
    • Adam algorithm and classification function: 1
    • Adam algorithm and classification task: 1
    • Adam algorithm and global parameter: 1
    • binary classification task and classification function: 1
    • binary classification task and classification task: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25
    • binary classification task and cross entropy loss function: 1, 2
    • binary classification task and global parameter: 1
    • classification function and cost function: 1
    • classification function and cross entropy loss function: 1
    • classification function and global parameter: 1
    • classification task and cross entropy loss function: 1, 2
    • classification task and data label: 1, 2
    • classification task and global parameter: 1
    • classification task and health condition: 1
    • cost function and dimensional vector: 1
    • diagnosis result and health condition: 1
    • different health condition and health condition: 1, 2, 3, 4, 5, 6, 7