Art Gallery

Sensing the Sound Web

We present a sound installation that aims to bring artificial life into the real world. Environmental patterns, including human behaviors, are captured by a distributed autonomous sensor network of eight nodes to form a soundscape. What makes the sensor network unique is that we employ artificial chemistry to control the sampling cycle of each sensor autonomously (i.e., sensors do not simply react to environmental changes but sometimes resist them). A minimal nonlinearity introduced by the artificial chemistry can foster unexpected spontaneous temporal oscillations in the sampling cycles, which we call the resonating state. The resonating state can vary drastically, depending on the spatial and temporal context, and also as a function of particular parameter settings. The internal dynamics memorize past experiences and learn environmental affordances. As a result, resonating states are spontaneously generated, and the system's long-term behavior is studied for its ability to produce life-like phenomena. The system is installed in a white cube; by walking around the space, visitors experience different soundscapes generated by two automatically controlled parametric speakers.
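The idea of a sampling cycle driven by a minimally nonlinear internal state, rather than by the input alone, can be sketched in a few lines. This is a hypothetical illustration, not the authors' actual artificial chemistry: it uses a logistic-map-style update (the nonlinearity strength `r`, the drive coupling, and the period mapping are all assumptions) to show how a constant environment can still yield spontaneous oscillations in each node's sampling period.

```python
import random


class SensorNode:
    """Hypothetical node: sampling period governed by a nonlinear internal state."""

    def __init__(self, r=3.2, base_period=1.0):
        self.x = random.uniform(0.1, 0.9)  # internal "chemical" concentration
        self.r = r                         # nonlinearity strength (assumed value)
        self.base_period = base_period     # nominal sampling period in seconds

    def step(self, env_input):
        # Logistic-map-style update with a weak environmental drive:
        # the node does not simply track the input, it has its own dynamics.
        self.x = self.r * self.x * (1.0 - self.x) + 0.01 * env_input
        self.x = min(max(self.x, 0.0), 1.0)
        return self.sampling_period()

    def sampling_period(self):
        # Higher concentration -> faster sampling (shorter period).
        return self.base_period / (0.5 + self.x)


random.seed(0)
nodes = [SensorNode() for _ in range(8)]  # eight nodes, as in the installation

# Even with a constant environment, the periods keep oscillating rather
# than settling to a fixed value, loosely mimicking a "resonating state".
periods = [[n.step(env_input=0.5) for n in nodes] for _ in range(20)]
```

With `r = 3.2` the map settles onto a period-2 cycle, so each node's sampling period alternates between two values instead of converging, one simple way a "resonating state" could emerge from minimal nonlinearity.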

Takashi Ikegami
The University of Tokyo

Mizuki Oka
The University of Tokyo

Norihiro Maruyama
The University of Tokyo

Yu Watanabe
The University of Tokyo

Akihiko Matsumoto
Independent artist