Client
Georgia Institute of Technology
Services

Rapid Prototyping, Product Design, Arduino + Processing Programming, Sound Design

Year
2020
Prototype

LEAP BEAT™

LEAP BEAT™ is a tool exploring speculative ideas of how musicians can embody the motion of sound. Could someone perform for a live audience, or record a track, without touching any instruments? This prototype engages curiosity, function, and play in ways that suggest an optimistic 'yes'.

The Process

What does it look like when I play a drum machine like a theremin?

The Leap Motion controller uses infrared cameras to track hand motions, positions, and gestures. With Processing’s Leap Motion library, the interface translates user actions into triggers for audio samples. Audio playback is handled by the Minim library, and the interface reads Serial input from the physical Arduino circuit, which houses 3 buttons representing 3 system states. Each system state is associated with its own sample pack of loops and one-shots, so users can play varying arrays of sounds.
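
The project's exact wiring isn't shown here, but a minimal sketch of how these pieces could fit together might look like the following. It assumes the voidplus LeapMotion for Processing library, placeholder sample file names, and an Arduino that sends a single character ('0' to '2') whenever one of its buttons is pressed:

```processing
import de.voidplus.leapmotion.*;
import ddf.minim.*;
import processing.serial.*;

LeapMotion leap;
Minim minim;
Serial arduino;

AudioSample[] kicks; // one kick one-shot per system state (placeholder files)
int state = 0;       // active sample pack, set by the Arduino buttons

void setup() {
  size(800, 600);
  leap = new LeapMotion(this).withGestures();          // enable Leap gestures
  minim = new Minim(this);
  arduino = new Serial(this, Serial.list()[0], 9600);  // port index may differ

  kicks = new AudioSample[] {
    minim.loadSample("kick_0.wav"),
    minim.loadSample("kick_1.wav"),
    minim.loadSample("kick_2.wav")
  };
}

void draw() {
  background(20);
}

// Each Arduino button press arrives over Serial as '0', '1', or '2'.
void serialEvent(Serial port) {
  int in = port.read();
  if (in >= '0' && in <= '2') {
    state = in - '0'; // switch to that button's sample pack
  }
}

// A Key Tap Gesture triggers the kick from the active pack.
void leapOnKeyTapGesture(KeyTapGesture g) {
  kicks[state].trigger();
}
```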

Additionally, there is a visualizer feature within the interface that is still a work in progress. Each action will produce visual feedback from the system alongside its associated sound (e.g., a random background color is selected on every Key Tap Gesture).
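
As a rough, self-contained sketch of that feedback loop (again assuming the voidplus library), the background-color behavior might look like this:

```processing
import de.voidplus.leapmotion.*;

LeapMotion leap;
color bg = color(20);

void setup() {
  size(800, 600);
  leap = new LeapMotion(this).withGestures();
}

void draw() {
  background(bg);
}

// Every Key Tap Gesture picks a new random background color,
// so the visual feedback lands together with the triggered sound.
void leapOnKeyTapGesture(KeyTapGesture g) {
  bg = color(random(255), random(255), random(255));
}
```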

How can people learn new ways to engage with musical experiences and education?

When multiple layers of machinery and interaction are removed, novice players can more easily embody sonic experiences while entering a state of flow. The LEAP BEAT™ visualizer pairs sound and image in ways that enhance synesthetic interaction. Intertwining so many sensory experiences potentially deepens the emotional ties that people build through play.

What's Next?

In this iteration of LEAP BEAT™, the sound triggering lacks any form of quantization. Adding quantization synced to a specified BPM and time signature would be the first improvement for the next iteration of this project. There is also significant latency between a player's movements and the system's response. When people play music, latency can break the essential feeling of real-time feedback that any level of expressive flow depends on. The last improvement would be clearer visual cues for motion tracking, along with interactive instructions that explain how to use the tool.
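
A minimal sketch of what that quantization could look like (hypothetical, not part of the current build): instead of triggering a sample the moment a gesture lands, the gesture queues the hit, and the draw loop fires it on the next 16th-note boundary at a fixed BPM. Here mousePressed stands in for a Leap gesture, and the sample file name is a placeholder:

```processing
import ddf.minim.*;

Minim minim;
AudioSample kick;

float bpm = 120;
float stepMs;            // length of one 16th note in milliseconds
float nextStep;          // time of the next grid boundary
boolean pending = false; // a gesture arrived; fire it on the next step

void setup() {
  minim = new Minim(this);
  kick = minim.loadSample("kick.wav"); // placeholder file name
  stepMs = 60000.0 / bpm / 4.0;        // 4 sixteenth notes per beat
  nextStep = millis();
}

void draw() {
  // Advance the grid; fire any pending trigger exactly on the boundary.
  while (millis() >= nextStep) {
    if (pending) {
      kick.trigger();
      pending = false;
    }
    nextStep += stepMs;
  }
}

// Stand-in for a Leap gesture: queue the hit instead of playing it now.
void mousePressed() {
  pending = true;
}
```

One caveat with this approach: the grid is only checked once per draw() frame, so its accuracy is bound to the frame rate (up to roughly 16 ms of jitter at 60 fps). A production version would likely move the clock to a dedicated timing thread.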

Details

1. Left Hand - Loop ambient/nature sound

2. Right Hand - Loop melody

3. Swipe Gesture - Trigger riser one-shot

4. Circle Gesture - Trigger sound FX one-shot

5. Screen Tap - Trigger high percussion one-shot

6. Key Tap - Trigger kick one-shot
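
In code, this mapping could translate roughly into the gesture callbacks of the LeapMotion for Processing library, which this project plausibly uses. File names are placeholders, and `state == 3` marks the end of a continuous gesture:

```processing
import de.voidplus.leapmotion.*;
import ddf.minim.*;

LeapMotion leap;
Minim minim;
AudioPlayer ambientLoop, melodyLoop;
AudioSample riser, fx, highPerc, kick;

void setup() {
  size(800, 600);
  leap = new LeapMotion(this).withGestures();
  minim = new Minim(this);

  // Placeholder file names standing in for the Splice sample packs.
  ambientLoop = minim.loadFile("ambient.wav");
  melodyLoop  = minim.loadFile("melody.wav");
  riser       = minim.loadSample("riser.wav");
  fx          = minim.loadSample("fx.wav");
  highPerc    = minim.loadSample("high_perc.wav");
  kick        = minim.loadSample("kick.wav");
}

void draw() {
  background(20);
  // 1 & 2: a visible left or right hand starts its loop.
  for (Hand hand : leap.getHands()) {
    if (hand.isLeft()  && !ambientLoop.isPlaying()) ambientLoop.loop();
    if (hand.isRight() && !melodyLoop.isPlaying())  melodyLoop.loop();
  }
}

// 3: riser one-shot when a swipe completes.
void leapOnSwipeGesture(SwipeGesture g, int state) {
  if (state == 3) riser.trigger();
}

// 4: sound FX one-shot when a circle completes.
void leapOnCircleGesture(CircleGesture g, int state) {
  if (state == 3) fx.trigger();
}

// 5: high percussion one-shot.
void leapOnScreenTapGesture(ScreenTapGesture g) {
  highPerc.trigger();
}

// 6: kick one-shot.
void leapOnKeyTapGesture(KeyTapGesture g) {
  kick.trigger();
}
```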

Sources

Samples: Purchased from Splice.com

Leap Motion Library

Minim Library

Serial Library

openprocessing.org - https://www.openprocessing.org/sketch/757190/
