For as long as people have been designing digital musical instruments, they have been thinking about how to make these instruments accessible to people who may struggle to participate actively in music-making because of cognitive or motor constraints arising from an inherited or acquired disability.
From the work of the Drake Project, based in the UK [1], to the work of Pauline Oliveros and her colleagues on AUMI [2], the impulse to facilitate collaborative music making by means of digitally-enhanced instruments has often exceeded, in creative approach and scope, mainstream work to make generic HCI accessible. This is partly because work on Accessible Digital Musical Instruments (ADMI) is not constrained by the need to deliver access solutions that fit with pre-existing hardware and software environments (e.g. keyboards/mice and Windows/Mac interaction paradigms), and partly because digital musical instruments (DMIs) already contain a wider array of sensors to track a broader range of human movement.
As with much work in general HCI, the development of ADMIs has engaged with people with disabilities primarily in the role of end-users, even when such work has purported to follow an ‘inclusive design’ approach. TAMIE sets out not only to embody an inclusive design approach, but also to build an environment where users themselves ‘train’ their own instruments to respond to their movements.
Taking seriously the dictum of the Third Paradigm in HCI, which asserts that we design for the bodies of diverse users working in diverse environments as the norm rather than the exception [3], TAMIE will develop a platform that combines sensing hardware, musical synthesis tools and an open mapping layer, supported by machine learning, that a user can use to map sensed action to musical output.
TAMIE takes advantage of the fact that, as Gillian and others have noted, a DMI of any kind must recognize all of a player's gestures all of the time, in contrast to general-purpose gesture recognition systems (e.g. on phones), which can get away with recognizing most people's gestures most of the time [4]. In other words, as long as a player can reproduce a gesture consistently enough to fall within the constraints of a classification system (which itself can be programmably tuned using a gesture learning model), an instrument can be configured to respond to their movements in ways that they themselves can determine.
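To make the idea concrete, the kind of per-user mapping described above can be sketched as a minimal nearest-neighbour classifier with a user-tunable acceptance threshold. This is an illustrative sketch only, not TAMIE's actual architecture: the feature vectors, gesture names, threshold value, and note mapping below are all invented for the example.

```python
import math

def distance(a, b):
    """Euclidean distance between two sensor feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureClassifier:
    """A user 'trains' the instrument by recording examples of their own
    gestures; new sensor frames are labelled by the nearest example."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, label) pairs

    def train(self, features, label):
        self.examples.append((features, label))

    def classify(self, features, threshold=1.0):
        # Return the label of the closest training example, or None if
        # nothing falls within the threshold -- the 'programmably tuned'
        # constraint mentioned above.
        if not self.examples:
            return None
        best = min(self.examples, key=lambda ex: distance(ex[0], features))
        return best[1] if distance(best[0], features) <= threshold else None

# A user records two of their own movements...
clf = GestureClassifier()
clf.train([0.9, 0.1, 0.0], "raise_arm")
clf.train([0.1, 0.8, 0.2], "tilt_head")

# ...and maps each recognized gesture to a musical action (here, MIDI notes).
note_map = {"raise_arm": 60, "tilt_head": 67}

frame = [0.85, 0.15, 0.05]             # a new, slightly varied sensor reading
gesture = clf.classify(frame)
print(gesture, note_map.get(gesture))  # -> raise_arm 60
```

The key design point is that the training data come entirely from the individual player, so the classifier's decision boundaries are fitted to that player's movement range rather than to a population norm.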
TAMIE builds primarily on three areas of prior work:
- the pre-existing examples of ADMIs and their evaluation found throughout the NIME, CHI, and TEI literature;
- the growing body of research on bodily movement as a design resource for developing interactive systems; and
- current work in AI and machine learning for gesture recognition, in HCI generally and DMI design in particular.
Inclusive Design Partners
I will use my extensive network of end-users who live with a variety of disabilities and who are also amateur or professional musicians to recruit a core group of co-designers. In addition, the project will also recruit smaller groups of naïve users to evaluate the project during its lifecycle.
Dissemination
In addition to publishing work produced with student and end-user participants in venues such as NIME, TEI and CHI, the project will conclude with a concert that will provide any interested participants with the opportunity to share their work and their art with the broader UofM concert-going community. We will also create a public-facing website to share findings, recruit participants and showcase performances.
Resources
The work of this project will draw on the resources of the Tech Suite in the PAT department. Prototype physical interfaces, incorporating a variety of physical controls and sensors, will be developed using the workshop facilities, while the Davis Studio space will be used for lab meetings and for workshops with participant designers. The facility is fully accessible, which is essential for this work.
References
[1] The Drake Music Project
https://www.drakemusic.org
[2] Tucker, Sherrie (2024). Improvising Across Abilities: Pauline Oliveros and the Adaptive Use Musical Instrument. University of Michigan Press.
[3] Harrison, S., Tatar, D., and Sengers, P. (2007). The Three Paradigms of HCI. HCI Journal, Taylor and Francis.
[4] Gillian, Nicholas (2011). Gesture Recognition for Musician Computer Interaction. Ph.D. thesis, Queen’s University Belfast.
Students apply to a specific role on the team as follows:
All students will work to design the architecture of the underlying platform, in order to provide them with an opportunity to understand the process from a systems perspective and to engage in the iterative lifecycle of the project.
Programmer (2 Students)
Preferred Skills: Students with programming experience and knowledge of current AI techniques, who will develop the tools that users will use to train the system to recognize their gestures.
Likely Majors/Minors: ARCH, CE, CS, ECE, EE, PAT, ROB, SI
Audio Coding & Data Mapping (2 Students)
Preferred Skills: Students with skills to develop the synthesis and control frameworks onto which gestures will be mapped. Experience with Max/MSP is preferred.
Likely Majors/Minors: CE, CS, ECE, EE, PAT, ROB, SI
Interface Designer (2 Students)
Preferred Skills: Students with skills to develop the user-facing interface to the system, and to carry out evaluations of systems during their development.
Likely Majors/Minors: ARTDES, CE, CS, ECE, EE, PAT, ROB, SI
Faculty Project Lead
Sile O’Modhrain has worked as a researcher and faculty member both in the U.S. and abroad, including at the MIT Media Lab, Media Lab Europe, and the Sonic Arts Research Centre at Queen’s University Belfast. She has also worked for BBC Radio as an audio engineer and program producer. Her research focuses on haptics (touch and gesture) and its relationship to music performance, and on the development of new interfaces for technology-enhanced instruments that extend the boundaries of musical expression. She combines experience across audio, psychoacoustics, computer music, cognition, and gestural control of music, and she is internationally known and respected in her field, as evidenced by her record of scholarly publication in well-regarded journals and her frequent appearances as a speaker at international conferences.
Students: 6
Likely Majors/Minors: ARCH, ARTDES, CE, CS, ECE, EE, PAT, ROB, SI
Meeting Details: TBD
Application: Consider including a link to your portfolio or other websites in the personal statement portion of your application to share work you would like considered as part of your submission.
Summer Opportunity: Summer research fellowships may be available for qualifying students.
Citizenship Requirements: This project is open to all students on campus.
IP/NDA: Students who successfully match to this project team will be required to sign an Intellectual Property (IP) Agreement prior to participation.
Course Substitutions: CoE Honors