Presented at the Future Directions of Music Cognition International Conference, March 2021
Co-contributor: Claire Arthur
This is a systematic analysis of how multiple voices move together in the choral works of the influential Renaissance composer Palestrina.
We found that Palestrina used different voice-leading patterns at different hierarchical levels (from one pulse beat to the next, vs. all note-to-note transitions). For example, moving to a perfect interval by parallel motion (a "forbidden" pattern in Renaissance music) occurred more often at the higher metric level. Factors associated with breaking the rules were similar at the different metric levels, but the directions of the effects were not always what we might expect from a traditional music theory perspective.
Tools: Python (music21, pandas), JMP, Excel
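As a rough illustration of the kind of pattern the analysis counts, here is a minimal Python sketch of detecting parallel motion into a perfect interval between two aligned voices. The study extracted voice pairs from scores with music21; the pitch lists, alignment, and strictness rules below are illustrative assumptions, not the study's exact pipeline.

```python
# Sketch: flag parallel motion into a perfect interval between two voices.
# Pitches are MIDI numbers sampled at aligned time points.

PERFECT = {0, 7}  # unison/octave and perfect fifth, as pitch-class intervals

def parallel_perfects(upper, lower):
    """Indices i where both voices move by the same melodic interval from
    note i to i+1 and arrive on a perfect interval (strict parallel motion;
    relaxing `du == dl` to same-sign would also catch similar motion)."""
    hits = []
    for i in range(len(upper) - 1):
        du = upper[i + 1] - upper[i]
        dl = lower[i + 1] - lower[i]
        if du == 0 or dl == 0:  # oblique motion: one voice holds still
            continue
        arrives_perfect = (upper[i + 1] - lower[i + 1]) % 12 in PERFECT
        if du == dl and arrives_perfect:
            hits.append(i)
    return hits

# Soprano and alto rising in parallel fifths on the first transition
print(parallel_perfects([72, 74, 76], [65, 67, 71]))  # -> [0]
```

Running the same detector over beat-level slices versus all note-to-note transitions is what makes the two metric levels comparable.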
Poster presented at Digital Libraries for Musicology, July 2021
This paper describes a web-based interactive tool for visually exploring a repository of sheet music. Starting with symbolic representations of the scores, we calculated high-level features such as tempo, time signature, mode, and a composite measure of difficulty. Each feature was then mapped to a distinct visual element, such as color, shape, or an icon. The interactive tool lets users explore the repository through these images, view the scores, and listen to MIDI versions and audio recordings. Use cases include organizing sheet music databases, similarity comparison and recommendation, and improving the user experience of searching for scores.
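The core idea, one feature per visual channel, can be sketched in a few lines. The feature names come from the paper; the specific mappings and thresholds below are hypothetical, not the tool's actual encoding.

```python
# Sketch of the feature-to-visual mapping: each score-level feature drives
# one visual channel. Mappings here are illustrative assumptions.

def visual_encoding(features):
    color = {"major": "warm", "minor": "cool"}.get(features["mode"], "gray")
    size = "large" if features["tempo"] >= 120 else "small"   # fast = big
    icon = f'{features["time_signature"]} icon'               # meter glyph
    return {"color": color, "size": size, "icon": icon,
            "difficulty_badge": round(features["difficulty"], 1)}

print(visual_encoding({"mode": "minor", "tempo": 140,
                       "time_signature": "3/4", "difficulty": 2.5}))
```

Keeping the mapping a pure function of the feature vector makes it easy to re-render the whole repository when a mapping choice changes.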
Presented at the Joint Statistical Meetings, 2017
Missing data is a common issue in survey research. When respondents skip some of the questions needed to calculate summary scales, imputation can be used to estimate the missing items. We compared imputation methods for a health-related quality of life measure, considering quantitative criteria such as bias and precision as well as practical considerations like interpretability for a non-technical audience. We found that simple is best: weighing all the pros and cons, an easy-to-implement item substitution method outperformed more complex approaches like multiple imputation.
Tools: SAS, SQL, Excel
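For a sense of how lightweight item substitution can be, here is a sketch of person-mean substitution, one common simple variant: each missing item is replaced by the mean of that respondent's answered items on the same scale. The half-answered threshold and scoring are illustrative, not the study's exact rules (the study itself was done in SAS).

```python
# Person-mean item substitution for a summed scale score.
# Missing items are None; the threshold below is an illustrative assumption.

def scale_score(items, min_answered=0.5):
    answered = [x for x in items if x is not None]
    if len(answered) < min_answered * len(items):
        return None                        # too much missing: score stays missing
    fill = sum(answered) / len(answered)   # person mean substitutes each gap
    return sum(x if x is not None else fill for x in items)

print(scale_score([3, None, 4, 5]))   # 3+4+5 answered, fill = 4.0 -> 16.0
print(scale_score([None, None, None, 2]))  # only 1 of 4 answered -> None
```

The appeal for a non-technical audience is exactly this transparency: the filled value is visible and explainable, unlike draws from a multiple-imputation model.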
2021, co-contributor: John McNamara
How has harmonic complexity in classical music changed over the past 200 years? Because the rules governing "acceptable" harmonies relaxed over time, we hypothesized that more recent pieces would contain more extended chords (i.e., 7ths, 9ths, 11ths, and 13ths) than earlier pieces. A systematic analysis of solo piano works (40 pieces divided into 50-year epochs) showed that the proportion of extended chords increased significantly over time. However, the effect size was smaller than expected, suggesting incremental rather than abrupt change in tonal harmony.
Tools: Python (music21, scikit-learn, matplotlib), Photoscore
Score formats: MIDI, MusicXML, PDF
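The epoch comparison can be sketched by classifying each chord as extended if it stacks a seventh (or higher) above its root, then taking the proportion per epoch. The real analysis used music21 chord objects parsed from scores; here chords are (root, pitch-class set) pairs, the seventh-based definition of "extended" is a simplifying assumption, and the data is a toy example, not the study's.

```python
# Proportion of extended chords in an epoch, with chords represented as
# (root_pitch_class, {pitch classes}) pairs. Illustrative data only.

SEVENTHS = {10, 11}  # minor/major seventh above the root, as pc intervals

def is_extended(root, pcs):
    # 9ths/11ths/13ths imply a 7th in the stack, so a 7th above the root
    # is used here as the marker of an extended chord.
    return any((pc - root) % 12 in SEVENTHS for pc in pcs)

def extended_proportion(chords):
    return sum(is_extended(r, pcs) for r, pcs in chords) / len(chords)

epoch = [(0, {0, 4, 7}),        # C major triad: not extended
         (7, {7, 11, 2, 5}),    # G7: extended
         (2, {2, 5, 9, 0, 4})]  # Dm9: extended (contains the 7th, C)
print(extended_proportion(epoch))  # -> 0.666...
```

Comparing these proportions across 50-year epochs (and testing the trend) is then a standard one-number-per-piece analysis.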
Exhibited in the Clough Art Crawl, Spring 2021 Virtual Show
This interactive code-based piece is about connection to our auditory environment. What happens when we really listen to the noise or silence around us? The noise transforms into something beautiful, but only if you are quiet.
Interactive code available at https://editor.p5js.org/llight/sketches/qn9pFri4p (requires microphone access)
Variation on Movement 1: Adagio Cantabile
He Zhanhao and Chen Gang (1959), arr. Debbie Shen (2019)
Co-contributors: Shauna Morrisey, Allyson Stout
In the spring of 2020, we set out to incorporate technology into a live chamber music performance. When COVID derailed those plans, we found an alternative: recording asynchronously in Soundtrap and creating visuals that represent the star-crossed lovers as they meet in the first movement. In the visualizations and the accompanying butterfly animation, each color represents one character/instrument. Frequency spectra computed with the Fast Fourier Transform are overlaid to illustrate the instruments' timbres through their differing distributions of partials.
Tools: Soundtrap, Adobe Premiere, Adobe Animate, Actionscript
My role: flute performance, mixing, visualizations
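The idea behind the spectra, that a Fourier transform exposes an instrument's partials, can be shown with a toy signal. The actual visuals were built from the recorded audio in Adobe tools; this pure-Python discrete Fourier transform of a synthetic tone with a fundamental and one weaker partial is only illustrative.

```python
import math

# One second of "audio" at a 256 Hz sample rate, so DFT bin k is k Hz.
# The tone has a fundamental at 16 Hz and a half-amplitude partial at 32 Hz.
N, sr, f0 = 256, 256, 16
signal = [math.sin(2 * math.pi * f0 * n / sr)
          + 0.5 * math.sin(2 * math.pi * 2 * f0 * n / sr)
          for n in range(N)]

def dft_magnitude(x, k):
    """Magnitude of DFT bin k, computed directly from the definition."""
    re = sum(x[n] * math.cos(2 * math.pi * k * n / len(x)) for n in range(len(x)))
    im = -sum(x[n] * math.sin(2 * math.pi * k * n / len(x)) for n in range(len(x)))
    return math.hypot(re, im)

# The two strongest bins sit exactly at the partials.
peaks = sorted(range(N // 2), key=lambda k: dft_magnitude(signal, k))[-2:]
print(sorted(peaks))  # -> [16, 32]
```

Overlaying such spectra for two instruments playing the same pitch makes the timbral contrast visible: the same fundamental, but different relative strengths among the partials.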
This game develops kids' listening and instrument recognition skills. The baby animals, each represented by a different instrument sound, are looking for their parents. Listen to the baby sing, then match it up with its parent's voice (the same instrument in a lower register).
Tools: Adobe Animate, Actionscript, MuseScore