Real-Time Spatial Music Composition System (RTSMCS) Version 2.0

Submitted By: Nativ Adler, Nathan Korman, Shiri Kuzin and Yossi Tamim.

The project was supervised by Dr. Dan Feldman from the Department of Computer Science and by Dr. Alon Schab from the Department of Music, and was developed as part of the Musicological Lab initiative.

The first Real-Time Spatial Music Composition System (RTSMCS) was created by Daniel Lederman.

The main goal of our project was to create an inspiring musical instrument.

The system uses Motive, OptiTrack's motion-capture software, together with six cameras to capture the real-time coordinates of a sensor-equipped glove.
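As a hedged illustration of the capture step, the coordinates could be read from Motive's NatNet stream with optirx (one of the libraries listed below) roughly as follows; the NatNet version tuple and the assumption that the glove is the first rigid body are hypothetical and depend on the actual Motive configuration:

    # Sketch: receiving glove coordinates from Motive via optirx.
    # The NatNet version and the glove being rigid body 0 are assumptions.
    import optirx as rx

    def stream_glove_positions():
        dsock = rx.mkdatasock()        # UDP socket on optirx's default multicast group
        version = (2, 7, 0, 0)         # assumed NatNet version, updated from SenderData
        while True:
            data = dsock.recv(rx.MAX_PACKETSIZE)
            packet = rx.unpack(data, version=version)
            if type(packet) is rx.SenderData:
                version = packet.natnet_version
            elif type(packet) is rx.FrameOfData and packet.rigid_bodies:
                yield packet.rigid_bodies[0].position  # (x, y, z) of the glove

Each (x, y, z) triple yielded here would then drive the sound parameters described below.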

The location of the glove in space is represented by three coordinates, X, Y, and Z, each of which controls a different parameter:

The X-axis controls the frequency

The Z-axis controls the amplitude (volume)

The Y-axis controls the number of component waves (harmonics) that define the complex wave

The basic wave is formed and played in real time, meaning any sound manipulation is heard instantly; a sketch of this mapping follows.
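As a hedged sketch of how that mapping could look in pyo, the following uses a band-limited Blit oscillator, whose harms argument sets the number of component waves; the coordinate ranges and scaling constants are assumptions for illustration, not the project's actual code:

    # Sketch of the axis-to-sound mapping with pyo (assumed ranges and constants).
    from pyo import Server, SigTo, Blit

    s = Server().boot()
    s.start()

    freq = SigTo(value=440, time=0.05)   # X-axis -> frequency (Hz)
    amp = SigTo(value=0.3, time=0.05)    # Z-axis -> amplitude (volume)
    harms = SigTo(value=10, time=0.05)   # Y-axis -> number of component waves

    # Blit's 'harms' argument sets how many harmonics form the complex wave.
    osc = Blit(freq=freq, harms=harms, mul=amp).out()

    def on_glove_move(x, y, z):
        # Glove coordinates assumed normalized to roughly -1..1 on each axis.
        freq.value = 220 + 440 * (x + 1) / 2         # 220-660 Hz across X
        amp.value = max(0.0, min(1.0, (z + 1) / 2))  # silent to full volume across Z
        harms.value = int(1 + 39 * (y + 1) / 2)      # 1-40 harmonics across Y

SigTo glides each parameter to its new value over 50 ms instead of jumping, which avoids audible clicks when the glove moves quickly.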

The system was developed using the following tools:

Development language: Python 2.7
IDE: JetBrains PyCharm
External Python libraries: pyo, mido, pygame, optirx, nibabel
Demonstration:
