Selected article for: "batch size and learning rate"

Authors: Asmaa Abbas; Mohammed Abdelsamea; Mohamed Gaber
Title: Classification of COVID-19 in chest X-ray images using DeTraC deep convolutional neural network
  • Document date: 2020_4_1
  • ID: i5jk0407_11
    Snippet: where the number of instances in A is equal to B, while C encodes the new labels of the subclasses (e.g. C = {l_11, l_12, ..., l_1k, l_21, l_22, ..., l_2k, ..., l_ck}). Consequently A and B can be rewritten as: For fine-tuning the parameters, the learning rate for all the CNN layers was fixed to 0.0001, except for the last fully connected layer (0.01); the mini-batch size was 64 with a minimum of 256 epochs; 0.001 was set fo.....
    Document: where the number of instances in A is equal to B, while C encodes the new labels of the subclasses (e.g. C = {l_11, l_12, ..., l_1k, l_21, l_22, ..., l_2k, ..., l_ck}). Consequently A and B can be rewritten as: For fine-tuning the parameters, the learning rate for all the CNN layers was fixed to 0.0001, except for the last fully connected layer (0.01); the mini-batch size was 64 with a minimum of 256 epochs; the weight decay was set to 0.001 to prevent overfitting during training; and the momentum value was 0.9. With the limited ...
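    The passage describes two concrete mechanisms: decomposing each original class c into k subclass labels (the set C = {l_11, ..., l_ck}), and fine-tuning with a lower learning rate on the backbone (0.0001) than on the last fully connected layer (0.01), with mini-batch size 64, at least 256 epochs, weight decay 0.001, and momentum 0.9. A minimal sketch of both ideas, with illustrative function and layer names that are assumptions, not taken from the paper's code:

    ```python
    def decompose_labels(classes, k):
        """Map each original class c to k subclass labels,
        mirroring C = {l_11, ..., l_1k, ..., l_ck} in the text."""
        return {c: [f"l_{c}{j}" for j in range(1, k + 1)] for c in classes}

    def make_param_groups(layer_names, head_name, base_lr=1e-4, head_lr=1e-2):
        """Group layers so the backbone trains at 0.0001 and the last
        fully connected layer at 0.01, in the shape one would hand to
        an SGD-style optimizer's per-group learning rates."""
        backbone = [n for n in layer_names if n != head_name]
        return [
            {"params": backbone, "lr": base_lr},
            {"params": [head_name], "lr": head_lr},
        ]

    # Remaining hyperparameters quoted in the text.
    HYPERPARAMS = {"mini_batch_size": 64, "min_epochs": 256,
                   "weight_decay": 1e-3, "momentum": 0.9}

    subclasses = decompose_labels([1, 2], k=2)   # {1: ['l_11', 'l_12'], 2: ['l_21', 'l_22']}
    groups = make_param_groups(["conv1", "conv2", "fc"], head_name="fc")
    ```

    Splitting parameters into groups like this is the standard way to express the paper's "all CNN layers at 0.0001 except the last fully connected layer at 0.01" rule in a deep-learning framework.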

    Search related documents:
    Co-phrase search for related documents
    • batch size and parameter tuning: 1
    • CNN layer and fully connected layer: 1, 2, 3
    • fine parameter tuning and parameter tuning: 1, 2
    • fully connected layer and model train: 1