EMAIIL

Adam Schmidt, School of Music, Theatre & Dance

Collaborators: Michael Gurevich, Associate Professor, SMTD; Joe Gascho, Associate Professor, SMTD; Brent Gillespie, Professor, Robotics & Mechanical Engineering; Jeff Snyder, Professor of Musicology, Princeton; Andrew McPherson, Professor of Design Engineering and Music at Imperial College London; Julie Zhu, SMTD; Qi Shen, SMTD

The EMAIIL is a multidisciplinary research project in which I plan to collaborate with engineers, composers, and performing musicians to establish new areas of inquiry surrounding musical compositions for custom-built electromagnetically actuated musical instruments and interfaces.

“Electromagnetically Actuated” means using the principles of electromagnetism to generate vibrations and forces in mechanical systems to create acoustic sounds or tactile/haptic feedback. This research will infuse novel and creative engineering practice and craftsmanship with new music composition.
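As a rough illustration, the core sustain idea can be sketched in software: drive a coil positioned near a string with a periodic signal at the string's fundamental frequency. This sketch assumes equal temperament and a pure sinusoidal drive, and omits the real drive chain (power amplifier, coil, and string physics):

```python
import math

def string_frequency(midi_note: int) -> float:
    """Fundamental frequency (Hz) in equal temperament, A4 (MIDI 69) = 440 Hz."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

def drive_signal(midi_note: int, duration_s: float, sample_rate: int = 48000):
    """Samples of a sinusoid at the string's fundamental, suitable (after
    amplification) for exciting an electromagnetic coil near the string."""
    freq = string_frequency(midi_note)
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]

# Middle C is MIDI note 60 (~261.6 Hz)
samples = drive_signal(60, duration_s=0.01)
```

In practice the drive signal would be shaped by feedback from the string's actual motion, but the sketch captures the basic principle of continuous electromagnetic excitation.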

The research questions that will lead this new area of inquiry are:

  • Can composing for new musical instruments and interfaces be performed in parallel with their development rather than sequentially? Or taking things further, how can interaction design for these devices be driven and motivated by composition?
  • How can we document and transmit knowledge surrounding these bespoke and hand-crafted haptic interfaces, and does the creation of the interface itself become part of the composition?
  • How can electromagnetic actuation unearth never-before-realized sounds in the harpsichord? What about newly invented musical instruments?
  • How can electromagnetic actuation unearth never-before-realized tactile sensations in novel haptic controllers?

EM Harpsichord: During the Fall 2023 semester, Professor of Harpsichord Joseph Gascho and I built a prototype of a first-of-its-kind electromagnetic harpsichord. A traditional harpsichord appears similar to a piano, but its strings are plucked rather than struck and are typically made of brass or iron rather than steel.

The electronic system we developed can excite and sustain vibrations in the strings of the harpsichord, yielding new and interesting sonic results (listen to the links provided in the supporting documentation). We are further developing this instrument in the Winter 2024 semester to enable external control of the string vibrations via the Musical Instrument Digital Interface (MIDI) protocol, which will allow devices other than the instrument’s own keyboard to communicate with and control the instrument.
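For illustration, the raw bytes of the MIDI messages involved are simple to construct. How channel and velocity map onto string behavior is a hypothetical design choice here, not the finished instrument's protocol:

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Raw 3-byte MIDI note-on message: status byte 0x90 | channel, then data."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Raw 3-byte MIDI note-off message: status byte 0x80 | channel."""
    assert 0 <= channel < 16 and 0 <= note < 128
    return bytes([0x80 | channel, note, 0])

# Hypothetical usage: begin sustaining middle C on channel 1, then release it.
start = note_on(0, 60, 100)
stop = note_off(0, 60)
```

Any sequencer, laptop, or custom controller that can emit these three-byte messages could then drive the harpsichord's strings without touching its keyboard.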

However, this exciting endeavor excludes a key feature inherent and important to most traditional musical instruments: the tactile feedback felt by the fingertips and body. Each key of the instrument pushes a small plectrum against a string, exciting it with a plucking motion. Before the pluck occurs, the musician’s fingers can feel the spring force of the string pushing back through the plectrum, which is mechanically tied to the key mechanism; as the plectrum moves past the string, it creates a clicking sensation.

Professor Gascho has emphasized the significance of this tactile sensation when playing the instrument, motivating the Electromagnetic Harpsichord team to explore how we can make this instrument externally controllable while still providing tactile force feedback.

This is where explorations of new haptic interfaces will come into play – re-establishing the tactile haptic sensations that would otherwise be missing between instrument and interface.

Haptic Controllers: Haptic controllers generate vibrations and forces to provide tactile feedback in digital systems. An everyday example of haptic interaction is the false click generated in many smartphone screens and laptop touchpads – clicks that, as you may have noticed, disappear when these devices are powered off. These ‘clicks’ are generated by digitally controlled motors to provide the illusion of interacting with a real mechanical system such as a button.
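One common way such a click is synthesized, sketched here as a hypothetical example rather than any particular vendor's implementation, is a short, exponentially decaying sinusoid sent to a voice-coil or linear-resonant-style actuator:

```python
import math

def click_pulse(duration_ms: float = 8.0, freq_hz: float = 200.0,
                sample_rate: int = 8000):
    """A brief decaying sinusoid: the sharp onset and fast decay are what the
    fingertip reads as a mechanical 'click' when driven through an actuator."""
    n = int(sample_rate * duration_ms / 1000)
    decay = 5.0 / n  # amplitude falls to ~e^-5 of full scale by the end
    return [math.exp(-decay * i) * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

pulse = click_pulse()
```

Varying the duration, frequency, and decay of the pulse changes the perceived "material" of the virtual button, which is exactly the kind of parameter space an expressive musical controller can exploit.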

Haptic systems are becoming increasingly prominent in academic research, virtual reality, gaming, and more, but the price tag of off-the-shelf haptic systems can make them inaccessible to casual makers or electronic musicians.

My DIY Haptic Controllers: In the Winter 2023 semester, I began upcycling discarded 3.5” hard drives into ultra-low-cost haptic interfaces for musical expression. This started as a recreation of a simple haptic controller called The Plank by Bill Verplank (2002).

In the twenty years since that work was published, high-quality electronic components have become dramatically more accessible, and the controller can now be recreated with off-the-shelf parts for just a few dollars per unit. Recreating this controller at a reasonable price point has motivated me to explore additional ultra-low-cost methods of providing tactile feedback in expressive musical controllers.

Using upcycled electronics and digital fabrication, I plan to create a suite of low-cost haptic controllers intended primarily for controlling electromagnetically actuated musical instruments. These controllers will use various methods of sensing and actuating (optical, electromagnetic, inertial, etc.) that I have recently been exploring for generating sound and motion in electromechanical systems.
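Whatever the sensing modality, the core software pattern is the same: read a raw sensor value, normalize it, and map it onto an actuator command. A minimal sketch, with hypothetical 10-bit sensor and 8-bit PWM output ranges:

```python
def map_range(x: float, in_lo: float, in_hi: float,
              out_lo: float, out_hi: float) -> float:
    """Linear map with clamping: the workhorse of sensor-to-actuator mappings."""
    x = max(in_lo, min(in_hi, x))          # clamp to the input range
    t = (x - in_lo) / (in_hi - in_lo)      # normalize to 0..1
    return out_lo + t * (out_hi - out_lo)  # scale to the output range

# Hypothetical example: a 10-bit optical sensor reading (0-1023) mapped to an
# 8-bit PWM duty cycle (0-255) that sets coil current, hence vibration strength.
duty = map_range(512, 0, 1023, 0, 255)
```

Swapping the mapping function (linear, exponential, lookup table) is one of the cheapest ways to change how an interface "feels" without touching the hardware.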

Rough sketches of some of the first interfaces I plan on creating appear in the supporting documentation. In parallel with the development of these novel interfaces, I plan to collaborate with composers so that composition becomes a driving force behind creative design choices.

When developing new interfaces in the past, I have found myself trying to design and program a very generalized, flexible, and hyper-optimized solution that can then later be applied to music. In engineering, there is always room for improvement, so I sometimes find myself spinning my wheels while digging deeper and deeper into the weeds.

For this next stage of my research, I want to flip this paradigm and allow composition to drive creative engineering choices rather than developing over-generalized solutions. Instead of working to engineer perfect and predictable haptic interfaces, I aim to explore and exploit the crunchy and unpredictable edge cases of sensors and actuators to relinquish total control and invite musical expression to arise from rapid prototyping, sloppy code, and experimental composition.

These current and new haptic interfaces will be able to play nicely with the next phase of Electromagnetic Harpsichord research. The physical decoupling of the Electromagnetic Harpsichord’s sound generation from its interface enables rapid prototyping of new compositional and interaction ideas.

The interface no longer needs a 1:1 mapping between each key and note, and MIDI playback allows the instrument to be programmed with material that would not be humanly possible to perform with two hands. Composers can then compose not only for new sonic outcomes, but also the gestures that control them. As the toolmaker and technologist, it becomes my responsibility to bridge the gap between gestural inputs and sonic outputs and to enable these musical ideas.
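Breaking the 1:1 key-to-note mapping can be as simple as a lookup from gestures to sets of notes. The gesture names and voicings below are hypothetical placeholders, not a finalized vocabulary:

```python
def gesture_to_notes(gesture: str) -> list:
    """Hypothetical one-to-many mapping: a single named gesture expands into
    several MIDI note numbers, freeing the interface from a 1:1 key layout."""
    mapping = {
        "press": [60],                   # a single middle C
        "swipe": [60, 64, 67],           # a C major triad
        "shake": [48, 55, 60, 64, 72],   # a wide-spread voicing
    }
    return mapping.get(gesture, [])      # unknown gestures produce silence

notes = gesture_to_notes("swipe")
```

Because the harpsichord receives plain MIDI, each of these note lists can be dispatched as ordinary note-on messages, so a composer can redefine the gesture vocabulary without any hardware changes.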

This Electromagnetic Harpsichord will be the primary instrument for which my collaborators compose, and the newly invented interfaces will explore how we can totally reimagine how a 15th-century instrument can sound and be played in the 21st century. This research will form a significant component of my Master’s Thesis.

This research opens new opportunities in human-computer interaction (HCI), telematic music, composition experiments, and more. This project will culminate in a combined public demonstration and performance of new compositions for Electromagnetically Actuated Instruments and Interfaces in late April as a component of my thesis defense.

Personal Career Goals and Development: I aspire to design and build musical instruments and develop novel musical technology professionally. Many past projects and experiences have helped me develop the technical chops in electronics, digital fabrication, and craftsmanship required to execute this project, but these experiences have lacked sustained collaboration or structure.

I have reached a point in my creative practice where I feel it is time to begin focusing on developing a self-directed studio practice that encompasses my personal aesthetics, guiding design principles, and structured collaboration and design cycles.

Using this project as a first attempt at this, I hope that scheduled performances and deadlines will provide structure for this collaborative research, bolstering my creative throughput and contributions in research communities such as New Interfaces for Musical Expression (NIME) and the ACM Conference on Human Factors in Computing Systems (CHI), and improving my ability to conduct independent research and establish myself as a competent designer, artist, and inventor in these fields.

Personnel, Expertise, & Resources: I will be leading this new area of research, but it will involve numerous collaborative efforts with faculty, industry professionals, and peers. I possess many of the necessary craft-based and engineering skills to get started but have already scheduled consultations with Music Technology leaders Jeff Snyder and Andrew McPherson to help review some of my designs and further brainstorm creative engineering solutions for electromagnetic actuation in musical instruments.

Additionally, professors Brent Gillespie, Michael Gurevich, and John Granzow (all of whom are on my MA thesis committee) will be able to provide me guidance and point me towards research surrounding acoustics, human-computer interaction, haptics, and more.

I have already begun conversations with several composers that are interested and invested in this project, many of whom contributed to formulating and developing this research plan. Connections I made at the NIME 2023 conference have been valuable resources in technical and philosophical development of my designs and are now shifting into longer-term collaborations.

Timeline: From now until May 2024, many components of this project are already underway. In particular, the Electromagnetic Harpsichord continues to be developed, with a completed version scheduled by the end of the W24 semester.

In the beginning of the W24 semester, I will begin collaborating closely with composers, conducting quick exercises and experiments that set the stage for the rest of my thesis work.

A public performance/demonstration of EMAIIL will be scheduled for mid-to-late April, where new works for the EM Harpsichord and my interfaces will be premiered. I have all of the resources necessary to conduct this research available to me between my personal home workshop and the PAT workshop.