Authors: He, Shujun; Gao, Baizhen; Sabnis, Rushant; Sun, Qing
Title: Nucleic Transformer: Deep Learning on Nucleic Acids with Self-attention and Convolutions
Cord-id: i9oqwrn1
Document date: 2021-07-26
Document: Much work has been done to apply machine learning and deep learning to genomics tasks, but these applications usually require extensive domain knowledge, and the resulting models offer very limited interpretability. Here we present the Nucleic Transformer, a conceptually simple yet effective and interpretable model architecture that excels at a variety of DNA/RNA tasks. The Nucleic Transformer processes nucleic acid sequences with self-attention and convolutions, two deep learning techniques that have proved dominant in computer vision and natural language processing. We demonstrate that the Nucleic Transformer can be trained in both supervised and unsupervised fashion, without much domain knowledge, to achieve high performance with limited amounts of data on Escherichia coli promoter classification, viral genome identification, and prediction of the degradation properties of COVID-19 mRNA vaccine candidates. Additionally, we showcase the extraction of promoter motifs from learned attention and show how direct visualization of self-attention maps supports informed decision making with deep learning models.
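The abstract describes an architecture that combines convolutions (for local motif extraction) with self-attention (for long-range dependencies). The following is a minimal sketch of that general idea in PyTorch, not the authors' released implementation: the class name, vocabulary, layer sizes, and the mean-pooled classification head are all illustrative assumptions.

    # Sketch only: embeds integer-encoded nucleotides, applies a 1D
    # convolution to aggregate local k-mer context, then a transformer
    # encoder for global self-attention. All hyperparameters are assumed.
    import torch
    import torch.nn as nn

    class NucleicTransformerSketch(nn.Module):
        def __init__(self, vocab_size=5, d_model=128, kernel_size=7,
                     n_heads=8, n_layers=4, n_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)  # A, C, G, T/U, pad
            # Convolution captures local motifs before attention is applied.
            self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                                  padding=kernel_size // 2)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, n_layers)
            self.head = nn.Linear(d_model, n_classes)

        def forward(self, tokens):
            # tokens: (batch, seq_len) integer-encoded nucleotide sequences
            x = self.embed(tokens)                            # (B, L, d_model)
            x = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local motifs
            x = self.encoder(x)                               # self-attention
            return self.head(x.mean(dim=1))  # sequence-level prediction

    # Usage: classify a batch of two length-100 sequences.
    model = NucleicTransformerSketch()
    logits = model(torch.randint(0, 5, (2, 100)))
    print(logits.shape)  # torch.Size([2, 2])

For a classification task such as the E. coli promoter prediction mentioned above, a sequence-level head like the mean-pooled linear layer here is a natural fit; per-position tasks (e.g. predicting degradation at each nucleotide) would instead apply the head to every position of the encoder output.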