Author: Kaifu Gao; Duc Duy Nguyen; Rui Wang; Guo-Wei Wei
Title: Machine intelligence design of 2019-nCoV drugs
Document date: 2020-02-04
ID: 1qniriu0_7
Document: The autoencoder, consisting of an encoder, a latent space, and a decoder, encodes a molecular SMILES string into a latent space representation X, which, after being further modified by a molecular generator, is translated back into a SMILES string by the decoder. Both the encoder and decoder are built from gated recurrent units (GRUs). GRUs mitigate the vanishing gradient problem that occurs in recurrent neural network (RNN) models but are simpler than long short-term memory (LSTM) models, making them well suited to moderately complex sequences such as small-molecule SMILES strings. A pre-trained autoencoder model developed by Winter et al. 19 is adopted in the present work. The latent space vector (X ∈ R^n), i.e., the molecular representation, has dimension 512 (n = 512).
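The encoding step described above can be sketched as a GRU that consumes a SMILES string character by character and emits its final hidden state as the 512-dimensional latent vector X. The sketch below is an illustrative numpy reimplementation, not the pre-trained model of Winter et al.; the toy character vocabulary, embedding size, and random weights are assumptions made only for demonstration.

```python
import numpy as np

# Toy SMILES alphabet and sizes (assumptions for illustration only).
VOCAB = list("CNOScnos()[]=#+-@Hh/\\123456789")
CHAR_TO_IDX = {c: i for i, c in enumerate(VOCAB)}
EMBED_DIM, HIDDEN_DIM = 32, 512  # n = 512 matches the latent size in the text

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """One GRU step: update gate z, reset gate r, candidate state h_tilde.
    The gating lets gradients flow through long sequences, which is how
    GRUs mitigate the vanishing-gradient problem of plain RNNs."""

    def __init__(self, in_dim, hid_dim):
        s = 1.0 / np.sqrt(hid_dim)
        self.Wz = rng.uniform(-s, s, (hid_dim, in_dim + hid_dim))
        self.Wr = rng.uniform(-s, s, (hid_dim, in_dim + hid_dim))
        self.Wh = rng.uniform(-s, s, (hid_dim, in_dim + hid_dim))

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                                # update gate
        r = sigmoid(self.Wr @ xh)                                # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))  # candidate state
        return (1.0 - z) * h + z * h_tilde                       # gated interpolation

def encode_smiles(smiles, cell, embed):
    """Run the GRU over the character sequence; the final hidden state
    plays the role of the latent representation X."""
    h = np.zeros(HIDDEN_DIM)
    for ch in smiles:
        x = embed[CHAR_TO_IDX[ch]]
        h = cell.step(x, h)
    return h

embed = rng.normal(size=(len(VOCAB), EMBED_DIM))
cell = GRUCell(EMBED_DIM, HIDDEN_DIM)
X = encode_smiles("CC(=O)Oc1ccccc1C(=O)O", cell, embed)  # aspirin SMILES
print(X.shape)  # (512,)
```

A matching GRU decoder would run the process in reverse, conditioning each predicted character on X; in the actual pipeline a molecular generator perturbs X before decoding.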