Author: Aminian, Manuchehr; Kehoe, Eric; Ma, Xiaofeng; Peterson, Amy; Kirby, Michael
Title: Exploring Musical Structure Using Tonnetz Lattice Geometry and LSTMs
ID: tcn30d4w
Document date: 2020-06-15
Document: We study the application of Long Short-Term Memory (LSTM) neural networks to the modeling and prediction of music. Machine learning approaches to modeling and predicting music often incorporate little, if any, music theory into their algorithms. In contrast, we propose an approach that employs minimal music theory to explicitly embed the relationships between notes and chord structure. We extend the Tonnetz lattice, originally developed by Euler to introduce a metric between notes, in order to induce a metric between chords. Multidimensional scaling is employed to embed chords in twenty dimensions while best preserving this music-theoretic metric. We then demonstrate the utility of this embedding in predicting the next chord of a musical piece, having observed a short sequence of previous chords. Applying a standard training, test, and validation methodology to a dataset of Bach chorales, we achieve an accuracy rate of 50.4% on validation data, compared to an expected rate of 0.2% for random guessing. This suggests that using Euler's Tonnetz for embedding provides a framework in which machine learning tools can excel in classification and prediction tasks with musical data.
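
Illustrative sketch: the pipeline summarized in the abstract can be pictured in two steps, (1) multidimensional scaling applied to a chord-to-chord distance matrix to obtain Euclidean chord vectors, and (2) an LSTM trained to predict the next chord from a short window of previous chords. The Python code below is a minimal sketch under stated assumptions, not the paper's implementation: the note and chord distances, the toy chord vocabulary, the window length, and all hyperparameters are placeholders, and only 3 embedding dimensions are used (the paper uses 20 and a Bach chorale dataset).

    # Sketch: MDS chord embedding from a stand-in distance matrix, then an
    # LSTM next-chord classifier. All specifics here are illustrative assumptions.
    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.manifold import MDS

    # Toy chord vocabulary: each chord is a set of pitch classes (0-11).
    chords = {
        "C": {0, 4, 7}, "G": {7, 11, 2}, "Am": {9, 0, 4},
        "F": {5, 9, 0}, "Dm": {2, 5, 9}, "Em": {4, 7, 11},
    }
    names = list(chords)

    def note_distance(a, b):
        # Placeholder for a Tonnetz-derived distance between two pitch classes.
        return min((a - b) % 12, (b - a) % 12)

    def chord_distance(c1, c2):
        # One simple (assumed) way to lift a note metric to chords:
        # average pairwise distance between the two pitch-class sets.
        return np.mean([[note_distance(a, b) for b in c2] for a in c1])

    D = np.array([[chord_distance(chords[x], chords[y]) for y in names] for x in names])
    np.fill_diagonal(D, 0.0)  # treat identical chords as distance zero

    # Embed chords in a low-dimensional Euclidean space approximating D.
    mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
    X = mds.fit_transform(D)  # shape (n_chords, 3)
    chord_vecs = {n: torch.tensor(v, dtype=torch.float32) for n, v in zip(names, X)}

    # LSTM classifier: observe `window` chords, predict the next one.
    class NextChordLSTM(nn.Module):
        def __init__(self, dim, hidden, n_classes):
            super().__init__()
            self.lstm = nn.LSTM(dim, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_classes)

        def forward(self, seq):                # seq: (batch, window, dim)
            out, _ = self.lstm(seq)
            return self.head(out[:, -1])       # logits over chord classes

    # Toy training data: a repeating progression standing in for chorale sequences.
    progression = ["C", "G", "Am", "F", "C", "G", "F", "C"] * 50
    window = 4
    xs = torch.stack([torch.stack([chord_vecs[c] for c in progression[i:i + window]])
                      for i in range(len(progression) - window)])
    ys = torch.tensor([names.index(progression[i + window])
                       for i in range(len(progression) - window)])

    model = NextChordLSTM(dim=3, hidden=32, n_classes=len(names))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(30):
        opt.zero_grad()
        loss = loss_fn(model(xs), ys)
        loss.backward()
        opt.step()

    with torch.no_grad():
        acc = (model(xs).argmax(dim=1) == ys).float().mean()
    print(f"training accuracy on the toy progression: {acc:.2f}")

The key design point reflected here is that the chord metric, not the raw chord labels, determines the geometry of the input space: MDS turns the music-theoretic distances into coordinates, and the LSTM only ever sees those coordinates.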