Blacksburg, VA | June 3–6, 2018


The International Conference on New Interfaces for Musical Expression gathers researchers and musicians from all over the world to share their knowledge and late-breaking work on new musical interface design. The conference started out as a workshop at the Conference on Human Factors in Computing Systems (CHI) in 2001. Since then, a series of annual international conferences has been held around the world, hosted by research groups dedicated to interface design, human-computer interaction, and computer music.

The theme of the 2018 international conference is “Mirrored Resonances.” In part, it reflects a conference format shared between two institutions seeking to bridge a physical divide through technology. More importantly, it is designed to encourage expressions of human-computer interaction that go beyond simply sharing audio-visual content. The theme also hints at a tighter integration between arts and sciences, music and engineering. We see it serving as the foundation for an integrative approach that will engage artists and engineers alike.

The conference will feature NIME research using distance technology, telematics, remote human-computer interaction, and virtual embodied immersion. We will present networked concerts between UVA and VT, and select events will be live-streamed to the outside world. While the conference will take place physically at VT, we envision musicians at the two host universities performing several joint telematic concerts over an Internet2 high-speed network connection. With this theme we would like to challenge the NIME community to imagine network music beyond mere connectivity, seeking new modes of interaction that defy physical separation. The theme is intended as a source of inspiration, not a constraint: we remain eager to showcase the wide variety of NIME research that may not fit it.

Conference participants will be able to make use of the new Virginia Tech Institute for Creativity, Arts, and Technology and the Moss Arts Center facilities, including the Cube, Fife Theatre, Sandbox, Perform, Learning, and Create studios, as well as the Digital Interactive Sound & Intermedia Studio and select School of Performing Arts facilities. These spaces will be networked with University of Virginia’s OpenGrounds, Music & Motion Capture Lab, Telematic Stage, and concert halls.


Telematic, or network, performance synthesizes traditional forms of media and information in a networked context, bringing issues of embodiment, interactivity, and distance to the foreground. Theorist and digital artist Roy Ascott wrote, “Telematic culture… enables one to participate in the production of global vision through networked interaction with other minds…” In the spirit of the NIME 2018 theme, Mirrored Resonances, we seek works that explore distance and interaction, network and distribution, and remote communication. NIMEs such as hyperinstruments and augmented instruments offer new ways to explore musical embodiment across networks. Audience-interactive systems enable remote listeners and viewers to become integrated into a performance as active agents across networks. Distributed ensembles, perhaps in concert with robotic prosthetics and avatars, expand the notion of “ensemble.” Symbolic transmission of performance information creates new possibilities for the visual and auditory experience of live music and for compositional score generation. Whether real-time composition systems enabled by NIME/network interaction, real-time remote performance, motion-enabled live electronics for synthesis and spatialization, network-mediated social interaction, or beyond, we seek works exploring the rich possibilities of telematic music.


Sonocybernetics explores the intersection of human, computer, and sound. The ensuing creations range from cybernetic exoskeletons and implants to biofeedback devices designed to enhance our perception and presence, as well as our sense of connectedness and empathy. Sonocybernetics can also manifest through immersive technologies such as virtual, augmented, and mixed reality. As part of NIME 2018 we encourage projects that explore this emerging space and the opportunities it offers for enhancing and augmenting the human capacity for musical creativity.


When sounds are produced electromechanically over loudspeakers, rather than mechanically on visually apparent objects such as traditional musical instruments, a link with observed reality is broken and an internal, imaginary world is engaged. Immersive environments raise the question of how vivid artificially created imaginary worlds can be. In the case of high-density loudspeaker arrays (HDLAs), both the considerably increased loudspeaker density and the periphonic projection of sound open the possibility of engulfing the listener in sound. How well can the vividness of sound fields created by HDLAs mirror the richness of the natural acoustic world? In what ways can an HDLA be treated as a NIME? These are some of the questions we invite participants to explore with us in this thematic area.


Music making is an activity designed by humans for humans, and its inherent power is to bring people together. With the unprecedented ubiquity of technologies that empower individuals and groups to engage in the unique experience of making music regardless of their educational background or location, NIMEs have a unique opportunity to democratize access to music. Doing so has the potential to extend the profound impact of music making across all generations, including K-12 education. NIMEs can also facilitate community building, empathy, and social and cultural impact. As part of NIME 2018 we encourage submissions designed to strengthen this mission and the democratization of NIMEs.