Selected article for: "long LSTM short term memory and LSTM short term memory"

Author: Kaifu Gao; Duc Duy Nguyen; Rui Wang; Guo-Wei Wei
Title: Machine intelligence design of 2019-nCoV drugs
  • Document date: 2020_2_4
  • ID: 1qniriu0_7
    Snippet: The autoencoder, consisting of an encoder, a latent space, and a decoder, is used to encode a molecular SMILES string into a latent space representation X, which, after being further modified by a molecular generator, is translated back to a SMILES string by a decoder. Both the encoder and decoder are constructed using gated recurrent units (GRUs). GRUs can deal with the vanishing gradient problem that occurs in recurrent neural network.....
    Document: The autoencoder, consisting of an encoder, a latent space, and a decoder, is used to encode a molecular SMILES string into a latent space representation X, which, after being further modified by a molecular generator, is translated back to a SMILES string by a decoder. Both the encoder and decoder are constructed using gated recurrent units (GRUs). GRUs can deal with the vanishing gradient problem that occurs in recurrent neural network (RNN) models but are simpler than long short-term memory (LSTM) models. GRUs are suitable for moderately complex sequences, such as small-molecule SMILES strings. A pre-trained autoencoder model developed by Winter et al. [19] is adopted in the present work. The latent space vector (X ∈ R^n), or molecular representation, has dimension 512 (n = 512).
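    To make the encoder side concrete, below is a minimal NumPy sketch of how a GRU-based encoder could map a SMILES string, character by character, into a 512-dimensional latent vector X. This is a hypothetical illustration with random weights and a toy character alphabet, not the pre-trained model of Winter et al.; the hidden size, projection matrix `W_latent`, and helper names are assumptions for demonstration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class GRUCell:
        """Minimal GRU cell (random weights; illustrative only)."""
        def __init__(self, input_dim, hidden_dim):
            s = 0.1
            # Update gate, reset gate, and candidate-state parameters.
            self.Wz = rng.normal(0, s, (hidden_dim, input_dim))
            self.Uz = rng.normal(0, s, (hidden_dim, hidden_dim))
            self.bz = np.zeros(hidden_dim)
            self.Wr = rng.normal(0, s, (hidden_dim, input_dim))
            self.Ur = rng.normal(0, s, (hidden_dim, hidden_dim))
            self.br = np.zeros(hidden_dim)
            self.Wh = rng.normal(0, s, (hidden_dim, input_dim))
            self.Uh = rng.normal(0, s, (hidden_dim, hidden_dim))
            self.bh = np.zeros(hidden_dim)

        def step(self, x, h):
            z = sigmoid(self.Wz @ x + self.Uz @ h + self.bz)        # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h + self.br)        # reset gate
            h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h) + self.bh)
            # Gated blend of old state and candidate state mitigates
            # vanishing gradients relative to a plain RNN.
            return (1 - z) * h + z * h_tilde

    # Toy SMILES alphabet (assumption; real models use a larger vocabulary).
    CHARS = "CNO()=#123456789cnos"

    def one_hot(ch):
        v = np.zeros(len(CHARS))
        v[CHARS.index(ch)] = 1.0
        return v

    def encode(smiles, cell, W_latent):
        """Run the GRU over the string, then project to the latent space."""
        h = np.zeros(cell.Wz.shape[0])
        for ch in smiles:
            h = cell.step(one_hot(ch), h)
        return W_latent @ h  # latent vector X, n = 512 as in the paper

    HIDDEN, LATENT = 64, 512
    cell = GRUCell(len(CHARS), HIDDEN)
    W_latent = rng.normal(0, 0.1, (LATENT, HIDDEN))
    X = encode("CC(=O)O", cell, W_latent)  # SMILES for acetic acid
    print(X.shape)
    ```

    A decoder would run the process in reverse, unrolling a GRU from X to emit one SMILES character per step; training both halves end to end yields the autoencoder described above.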
