
Creating New Interfaces for Musical Expression: Introduction to NIME

Thursday, 16 December | 7:00 pm - 10:45 pm | Room 314

Due to advances in digital audio technologies, computers now play a role in most music production and performance. Digital technologies offer unprecedented opportunities for the creation and manipulation of sound, but the flexibility of these new technologies implies an often confusing array of choices for composers and performers. Some artists use computers directly to create music, generating an explosion of new musical forms. However, most would agree that the computer is not a musical instrument in the same sense as traditional instruments, and it is natural to wonder how to "play the computer" using interface technology appropriate for human brains and bodies.

A decade ago, the presenters of this course organized the first workshop on New Interfaces for Musical Expression (NIME) to attempt to answer this question by exploring connections with the better-established field of human-computer interaction. The course summarizes what has been learned at the NIME conferences. It begins with an overview of the theory and practice of new musical interface design and asks: What makes a good musical interface? Are any useful design principles or guidelines available? It then reviews topics such as mapping from human action to musical output and control intimacy, and presents practical information about the tools for creating musical interfaces, including an overview of sensors and microcontrollers, audio synthesis techniques, and communication protocols such as Open Sound Control and MIDI. The remainder of the course consists of specific case studies of the major themes of the NIME conference, including augmented and sensor-based instruments, mobile and networked music, and NIME pedagogy.
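As a small illustration of the mapping and protocol topics above, the sketch below linearly maps a hypothetical 10-bit sensor reading onto a MIDI note range and packs the result into a minimal Open Sound Control (OSC 1.0) message using only the Python standard library. The address `/synth/pitch` and the note range are illustrative assumptions, not part of the course material.

```python
import struct

def sensor_to_pitch(raw: int, lo: float = 48.0, hi: float = 72.0) -> float:
    """Map a 10-bit sensor reading (0-1023) linearly onto a MIDI note range.

    This is the simplest possible action-to-sound mapping; real NIME
    designs often use nonlinear or many-to-many mappings instead.
    """
    return lo + (hi - lo) * raw / 1023.0

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC 1.0 message.

    OSC strings are null-terminated and padded with nulls to a multiple
    of four bytes; the type tag ",f" declares one 32-bit big-endian float.
    """
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Mid-scale sensor reading -> a pitch near middle C, ready to send over UDP
# (e.g. with socket.sendto) to any OSC-aware synthesizer.
msg = osc_message("/synth/pitch", sensor_to_pitch(512))
```

A message built this way can be sent to a synthesis environment such as Pure Data, Max/MSP, or SuperCollider, all of which accept OSC over UDP; MIDI would carry the same gesture as a 7-bit note number instead of a float.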



Intended Audience

Interaction designers, artists, and academic and industry researchers who have a general interest in interaction techniques for musical expression.

Presentation Language

Presented in English / 영어로 발표 됨


Prerequisites

Familiarity with the basics of interactive media is helpful, but no particular technical background is required. No background in music or computer audio is assumed.


Introduction to NIME Tools, Design, and Aesthetics 

So You Want to Build a NIME - Fels

Camera-Based Interfaces - Lyons

Design & Aesthetics of NIME - Lyons



Themes & Case Studies

NIME after NIME: Case Studies - Fels

NIME Theory - Lyons

NIME Education - Lyons

Concluding Remarks - Fels & Lyons


Sidney Fels
The University of British Columbia

Michael Lyons
Ritsumeikan University

Instructor Bios

Michael Lyons
Michael Lyons, professor of image arts and sciences at Ritsumeikan University, earned his PhD in physics at The University of British Columbia. He has worked in computational neuroscience, pattern recognition, cognitive science, and interactive arts. He was a research fellow at the California Institute of Technology (1992-1993) and a lecturer and research assistant professor at the University of Southern California (1994-1996). From 1996 to 2007, he was a senior research scientist at the Advanced Telecommunications Research International Labs in Kyoto, Japan. He joined the newly established College of Image Arts and Sciences at Ritsumeikan University in 2007. He was a co-founder of the New Interfaces for Musical Expression conference.

Sidney Fels
Sidney Fels, associate professor of electrical and computer engineering at The University of British Columbia, earned his PhD at the University of Toronto. He leads the Human Communications Technology Laboratory and is Director of the Media and Graphics Interdisciplinary Centre. He has worked in human-computer interaction, neural networks, intelligent agents, and interactive arts for over 10 years. He was a visiting researcher at ATR Media Integration & Communications Research Laboratories (1996-1997). His multimedia interactive artwork, the Iamascope, has been exhibited worldwide. He created Glove-TalkII, a system that maps hand gestures to speech. He was co-chair of Graphics Interface '00. He was also a co-founder of the New Interfaces for Musical Expression conference.