An exploration of various interfaces for musical expression led to the creation of an interface that allows a person without the use of their limbs to perform any composition created in the Ableton Live digital audio workstation. The system was built with a combination of the Unity game engine, Max for Live, and Ableton Live so that it would be easy to install and as flexible as possible. It also uses common eye-tracking hardware made by Tobii Dynavox, which a person with limited use of their limbs may already own. To demonstrate the feasibility of the new interface, a composition was written and performed. During the creation of the composition and rehearsal of the performance, the interface was refined until it functioned reliably and allowed some degree of improvisation. The result is an interface that can control any parameter within Ableton Live using a combination of eye gaze and head motion; the user can also trigger scenes within Ableton Live to control the flow of a composition. This interface could conceivably allow a person with disabilities to compose a piece and perform it entirely without the aid of another person.
This project was part of a master's thesis submitted in fulfillment of the requirements for the Master of Music in Music Technology in the Department of Music and Performing Arts Professions at the Steinhardt School, New York University.