Selected article for: "activation function and ReLU activation function"

Authors: Raj Dandekar; George Barbastathis
Title: Quantifying the effect of quarantine control in Covid-19 infectious spread using machine learning
  • Document date: 2020_4_6
  • ID: 222c1jzv_34
    Document: For the implementation, we choose an n = 2-layer densely connected neural network with 10 units in the hidden layer and the ReLU activation function. This choice was made because we found sigmoidal activation functions to stagnate. The final model was described by 63 tunable parameters. A schematic of the neural network architecture is shown in the attached Supplementary Information. The governing coupled ordinary differential equations for the augmented SIR model are
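
    As an illustration only (not the authors' code, which is not shown in this excerpt), the sketch below builds a densely connected network with one 10-unit ReLU hidden layer, as described above. The 4-dimensional input (the augmented SIR state), the scalar output, and the example input vector are assumptions; with those assumed dimensions the network alone has 61 weights and biases, and the exact breakdown of the 63 tunable parameters quoted above is not recoverable from this excerpt.

    import numpy as np

    def relu(x):
        # ReLU activation: element-wise max(x, 0)
        return np.maximum(x, 0.0)

    def init_params(n_in=4, n_hidden=10, n_out=1, seed=0):
        # Weights and biases for an input -> 10-unit hidden -> output network
        # (input/output sizes are assumptions, not stated in the excerpt)
        rng = np.random.default_rng(seed)
        return {
            "W1": rng.normal(scale=0.1, size=(n_hidden, n_in)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(scale=0.1, size=(n_out, n_hidden)),
            "b2": np.zeros(n_out),
        }

    def forward(params, x):
        # Dense layer -> ReLU -> dense layer: the n = 2-layer network described in the text
        h = relu(params["W1"] @ x + params["b1"])
        return params["W2"] @ h + params["b2"]

    params = init_params()
    print(sum(p.size for p in params.values()))                 # 61 weights/biases for 4 -> 10 -> 1
    print(forward(params, np.array([0.9, 0.05, 0.03, 0.02])))   # hypothetical normalised state vector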
