Monday 12 December | 14:15-18:00 | Room S226 + S227
Advances in digital audio technologies have led to a situation where computers play a role in most music production and performance. Digital technologies offer unprecedented opportunities for the creation and manipulation of sound. However, the flexibility of these new technologies presents composers and performers with an often confusing array of choices. Some artists have faced this challenge by using computers directly to create music, leading to an explosion of new musical forms. However, most would agree that the computer is not a musical instrument in the same sense as traditional instruments, and it is natural to ask how to 'play the computer' using interface technology appropriate for human brains and bodies. A decade ago we organized the first workshop on New Interfaces for Musical Expression (NIME) to attempt to answer this question by exploring connections with the established field of human-computer interaction. This course summarizes what has been learned at NIME. We begin with an overview of the theory and practice of new musical interface design, asking what makes a good musical interface and whether there are any useful design principles or guidelines available. We will also discuss topics such as the mapping from human action to musical output, and control intimacy. Practical information about the tools for creating musical interfaces will be given, including an overview of sensors and microcontrollers, audio synthesis techniques, and communication protocols such as Open Sound Control and MIDI. The remainder of the course will consist of several specific case studies representative of the major broad themes of the NIME conference, including augmented and sensor-based instruments, mobile and networked music, and NIME pedagogy.
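To make the mapping idea concrete: a new musical interface typically turns a raw sensor reading into a protocol message such as MIDI. The sketch below is a minimal, hypothetical illustration (not course material), assuming a 10-bit sensor and one of many possible mapping choices; the function name and scale are our own inventions.

```python
# Hypothetical sketch: mapping a raw sensor reading to a 3-byte MIDI
# note-on message. The 10-bit sensor range (0-1023) and the choice of
# a two-octave major scale are assumptions made for illustration only.

def sensor_to_midi_note_on(raw, channel=0, velocity=100):
    """Map a 10-bit sensor value (0-1023) onto a MIDI note-on message."""
    if not 0 <= raw <= 1023:
        raise ValueError("expected a 10-bit sensor reading (0-1023)")
    # Quantize the reading onto two octaves of a C major scale starting
    # at middle C (MIDI note 60) - one of many possible mappings.
    scale = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets of a major scale
    step = raw * 14 // 1024          # 14 scale degrees across two octaves
    note = 60 + 12 * (step // 7) + scale[step % 7]
    # MIDI note-on: status byte (0x90 | channel), note number, velocity.
    return bytes([0x90 | channel, note, velocity])
```

The design choice embodied here (quantizing to a scale) trades expressive nuance for ease of playing "in key", exactly the kind of mapping trade-off the course discusses.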
Level
Beginner
Intended Audience
The course is designed to have a broad appeal to interaction designers, game designers, artists, and academic and industry researchers who have a general interest in interaction techniques for multimodal musical expression.
Prerequisites
Attendees should be familiar with the basics of interactive media, but do not need any particular technical background. No background in music or computer audio is assumed.
Course Schedule
14:15 - 16:00 Session 1: Introduction to NIME Tools, Design, and Aesthetics
14:15-14:30: Introduction (Fels & Lyons)
14:30-15:10: Module 1: So you want to build a NIME (Fels)
15:10-15:30: Module 2: Camera-based Interfaces (Lyons)
15:30-15:50: Module 3: Design & Aesthetics of NIME (Lyons)
15:50-16:00: Discussion
16:00 - 16:15 Break
16:15 - 18:00 Session 2: Themes & Case Studies
16:15-16:50: Module 4: NIME after NIME: Case Studies (Fels)
16:50-17:15: Module 5: NIME Theory (Lyons)
17:15-17:30: Module 6: NIME Education (Lyons)
17:30-17:45: Concluding Remarks (Fels & Lyons)
17:45-18:00: Discussion
Presenter(s)
Michael Lyons is a Professor at the College of Image Arts and Sciences at Ritsumeikan University. He obtained his PhD in Physics from the University of British Columbia. Michael has worked in computational neuroscience, pattern recognition, cognitive science, and interactive arts. He was a Research Fellow at the California Institute of Technology (1992-1993), and a Lecturer and Research Assistant Professor at the University of Southern California (1994-1996). From 1996 to 2007 he was a Senior Research Scientist at the Advanced Telecommunications Research (ATR) International labs in Kyoto, Japan. He joined the new College of Image Arts and Sciences, Ritsumeikan University, as Professor in 2007. Michael co-founded the New Interfaces for Musical Expression (NIME) conference.
Sidney Fels is an Associate Professor in Electrical & Computer Engineering at the University of British Columbia. He obtained his PhD from the University of Toronto. He has worked in HCI, neural networks, intelligent agents, and interactive arts for over ten years. He was a visiting researcher at ATR Media Integration & Communications Research Laboratories (1996-1997). His multimedia interactive artwork, the Iamascope, has been exhibited worldwide. He created Glove-TalkII, which maps hand gestures to speech. He was co-chair of Graphics Interface 2000. He leads the Human Communications Technology Laboratory and is Director of the Media and Graphics Interdisciplinary Centre. Sidney co-founded the New Interfaces for Musical Expression (NIME) conference.