VAVE

Gesture based musical instrument

Problems addressed

Exploring different ways to interact in mixed reality

Key Contribution

Using sensors that record hand gestures for playing music


INTRODUCTION

Project Background

Vave, a 3-week project in collaboration with Ameya Nikose, is an exploration of mixed-reality interaction techniques through a virtual gesture-based instrument. The project's motivation was to learn to work with augmented/virtual/mixed reality and to create an enjoyable musical experience.

Create an enjoyable musical experience while exploring interactions in augmented/virtual reality.

Final Output

The final product is a handheld instrument played by waving and making other hand gestures over it. The physical form provides a tangible mode of interaction, while a Unity interface renders visualizations in VR for visual feedback.

INSPIRATION

An Instrument or a Game

To create an enjoyable music experience, we started by learning about recent technology-driven musical instruments, games, and experiences. The first challenge we faced was to define whether we were going to create an instrument or a game. We looked at Beat Saber (an existing musical game in virtual reality) and sketched out initial ideas for both directions.

After speculating on both directions, we settled on creating an instrument.

Inspiration for creating an instrument

To create an enjoyable experience, we dived deeper into existing electronic musical instruments and bodily ways of making music. Among digital instruments, the theremin was our main inspiration. We also drew on fiction, like a Mr. Bean sketch, for the gestures and emotions involved in playing music.

Theremin- Playing Music in Thin Air

The theremin is an electronic musical instrument controlled without physical contact by the thereminist (performer). The instrument's controlling section usually consists of two metal antennas that sense the relative position of the thereminist's hands and control oscillators for frequency with one hand and amplitude (volume) with the other. The electric signals from the theremin are amplified and sent to a loudspeaker.

Using Hand Movements in Mixed Reality

A video from Yago De Quay about an augmented reality musical instrument demonstrates a very interesting concept: controlling song and music parameters with hand movements using motion-capture technology.

Musical Installation

We also looked at different experiences like the piano tap dance, where players play a huge floor piano by dancing on its keys.

Mr. Bean- Gesture and Emotion Driven

Finally, we arrived at the idea of a musical instrument that is purely gesture and emotion driven. This is best explained by the Mr. Bean video, where he conducts a choir through emotion alone. Wouldn't it be awesome to play an instrument like that someday?

EXPLORATION

Exploration for Creating an Instrument

After getting inspired by some wild things on the internet, we turned to the technical side of the process and started exploring different ways of building the instrument in parallel.

We had a lot of fun learning about and quickly prototyping different interaction techniques in AR/VR/MR. As we did not have access to existing controllers, we had to look at alternatives.

Figuring Out Ways to Interact

We looked at methods of interacting without a controller: a raycast reticle to point and shoot with a phone, virtual buttons on an image marker in AR, and on-screen buttons in AR.

VR Gaze

One simple way to interact in VR was to use the phone's accelerometer with the Raycast Reticle feature of the Google VR SDK, which tracks where the person is looking and triggers a pre-defined interaction with the gazed-at element. In our prototype, when a person gazes at a cube, it changes color from white to red; if they keep staring past a specified time limit, it changes to black.
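As a rough sketch of this gaze logic in Unity C# (the OnGazeEnter/OnGazeExit hooks stand in for the reticle's raycast events; the hook names and hold time are illustrative):

```csharp
using UnityEngine;

// Rough sketch of the gaze-timer behaviour described above, attached to
// the cube. OnGazeEnter/OnGazeExit stand in for the reticle's raycast
// events (e.g. pointer-enter/exit from the Google VR reticle).
public class GazeColorCube : MonoBehaviour
{
    public float holdTime = 2f;   // seconds of continuous gaze before turning black
    private Renderer rend;
    private float gazedFor;
    private bool isGazed;

    void Start()
    {
        rend = GetComponent<Renderer>();
        rend.material.color = Color.white;
    }

    public void OnGazeEnter()     // wired to the reticle's pointer-enter event
    {
        isGazed = true;
        gazedFor = 0f;
        rend.material.color = Color.red;       // first feedback: white -> red
    }

    public void OnGazeExit()      // wired to the reticle's pointer-exit event
    {
        isGazed = false;
        rend.material.color = Color.white;     // reset when the user looks away
    }

    void Update()
    {
        if (!isGazed) return;
        gazedFor += Time.deltaTime;
        if (gazedFor >= holdTime)
            rend.material.color = Color.black; // sustained gaze: red -> black
    }
}
```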

Learnings: A simple built-in interaction, but having to move around and point the phone at elements in space could be tiring.

On-screen buttons in AR

In our next prototype, keys pop up in AR space; pressing them on the screen plays a song.

Learnings: Although set in AR, the experience felt like any other 2D application and was not enjoyable.

Mixed-Reality and Virtual Buttons

We also prototyped wearable AR by building a stereoscopic AR camera in Unity. This gave us tracker-based AR with virtual buttons while keeping the hands free, unlike hand-held AR on a mobile device. With the hands free, we could place virtual buttons in 3D world space and interact with them directly.
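A minimal sketch of such a virtual-button handler, assuming Vuforia's classic virtual-button API (a common way to get marker-based virtual buttons in Unity; the exact SDK we used is an assumption, as is the AudioSource wiring):

```csharp
using UnityEngine;
using Vuforia;

// Sketch of a marker-based virtual button that plays a note while pressed,
// using Vuforia's classic IVirtualButtonEventHandler API.
public class VirtualButtonNote : MonoBehaviour, IVirtualButtonEventHandler
{
    public AudioSource note;   // the sound this button triggers

    void Start()
    {
        // Register for press/release events of all buttons on this image target.
        foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        note.Play();    // hand covers the button on the marker
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb)
    {
        note.Stop();    // hand moves away
    }
}
```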

Learnings: Interactions became more immersive but the virtual buttons only work when visible through the camera.

Making Our Own Controllers

We also tried making our own controllers from simple sensors (an accelerometer, a gyroscope, and an ultrasonic sensor), mapping the hand movements they recorded to different musical variables.

Learnings: As the sensors were independent of the AR camera, one could look anywhere while interacting with the controllers.

This also enabled a more tangible mode of interaction, and we decided to take it forward.

But Where Is The Music In All This? - Audio Helm

Another challenge was producing musical sounds in Unity from the input of the sensors, virtual buttons, or raycast. We used the Audio Helm plugin, a live audio synthesizer, sequencer, and sampler for Unity that provides tools for creating dynamic sound effects and generative music. We then tweaked our code to control Audio Helm's variables from the sensor input, making the sound respond to gestures.
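A minimal sketch of that wiring: a normalized gesture value selects a note from a scale and triggers Audio Helm's HelmController (NoteOn/NoteOff are from the plugin; the scale array and retrigger logic are our own illustrative mapping):

```csharp
using UnityEngine;
using AudioHelm;

// Sketch: trigger Helm synth notes from a normalized 0..1 gesture value.
public class GestureToHelm : MonoBehaviour
{
    public HelmController helm;   // Audio Helm synth in the scene
    static readonly int[] gMajor = { 67, 69, 71, 72, 74, 76, 78, 79 }; // MIDI, one octave of G major
    int lastNote = -1;

    // Call every frame with the current normalized gesture/sensor reading.
    public void SetGestureValue(float t)
    {
        int idx = Mathf.Clamp(Mathf.FloorToInt(t * gMajor.Length), 0, gMajor.Length - 1);
        int note = gMajor[idx];
        if (note == lastNote) return;            // retrigger only on a new semitone
        if (lastNote >= 0) helm.NoteOff(lastNote);
        helm.NoteOn(note, 0.8f);                 // velocity in 0..1
        lastNote = note;
    }
}
```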

NARROWING DOWN

Gestures for Playing Music

For a holistic, seamless experience, we had to figure out not only the technical aspects but also the usability and experience design. To decide on gestures that would be intuitive and easy to perform, we did some role play and act-it-out sessions, pretending to control existing music with our hands. These Wizard of Oz prototypes helped us understand which hand movements feel enjoyable and natural for controlling music variables like pitch, volume, sustain, and vibrato. We also took inspiration from the Mr. Bean video.

Implementation- Wearable Controller

To connect the intended design with the technology, we had to work out an implementation for each gesture. As most were free-flowing hand gestures, we went with a sensor-equipped wearable glove. A 6-DOF inertial sensor (accelerometer and gyroscope) captured the rotational and translational values of the hand movement, and buttons on the fingers added further modalities: one could pinch the thumb and index finger together and move the hand as if holding and dragging a virtual slider to control any variable.
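A sketch of that pinch-and-drag slider modality, assuming the glove streams a pinch state and hand position into Unity (the UpdateFromGlove signature and sensitivity are illustrative):

```csharp
using UnityEngine;

// Sketch of the pinch-and-drag "virtual slider": while the thumb-index
// button is pressed, vertical hand movement drags an abstract value.
public class PinchSlider : MonoBehaviour
{
    [Range(0f, 1f)] public float value = 0.5f;   // the variable being controlled
    public float sensitivity = 0.5f;             // metres of travel for full range

    bool pinching;
    float grabY;        // hand height when the pinch started
    float grabValue;    // slider value when the pinch started

    public void UpdateFromGlove(bool pinchPressed, Vector3 handPosition)
    {
        if (pinchPressed && !pinching)           // pinch begins: grab the slider
        {
            pinching = true;
            grabY = handPosition.y;
            grabValue = value;
        }
        else if (!pinchPressed)                  // pinch released: let go
        {
            pinching = false;
            return;
        }
        // Drag the value with the vertical hand movement while pinching.
        value = Mathf.Clamp01(grabValue + (handPosition.y - grabY) / sensitivity);
    }
}
```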

Hand model by @Chandan1 from CGtrader

Discrete vs Continuous Music

One of the decisions we had to make was whether to transition smoothly between two semitones (like a theremin) or to jump directly between them. After testing both scenarios, we settled on discrete semitones, as the pitches in between were difficult to play deliberately.
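The two mappings we compared, in sketch form (normalized hand position in, pitch out; the one-octave span is illustrative):

```csharp
using UnityEngine;

// The two mappings we tested, from normalized hand position to pitch
// (in semitones above the root).
public static class PitchMapping
{
    // Continuous: pitch glides smoothly, theremin-style.
    public static float Continuous(float handT) => handT * 12f;

    // Discrete: snap to the nearest semitone (what we kept).
    public static int Discrete(float handT) => Mathf.RoundToInt(handT * 12f);
}
```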

PROTOTYPE AND TESTING

Tangible Instrument for Virtual Reality

Feedback on our concept so far suggested including something tangible to interact with. The intimacy of holding and interacting with a physical instrument makes playing music more enjoyable, and is also what we are used to. Considering this, we designed a simple wave-like form coherent with the kind of interaction we wanted: a light instrument that can be held in one arm like a child and played by waving the other hand over it. An ultrasonic sensor sits at one end, and its readings are calibrated so that each crest and trough of the form plays a different note of a scale. For the prototype, the instrument covers one octave of the G major scale.

Crests and Troughs for Tactile Feedback

Curves to Easily Fit in Hand

Physical Prototype and Role Playing

Back End- Sensor, Arduino, and Interfacing in Unity

The main component of the instrument was the ultrasonic sensor, which measured the distance of the hand from one end of the instrument. An Arduino Uno recorded the readings and sent them to Unity over serial port communication. In Unity, a C# script used Audio Helm to play the musical note corresponding to the hand's position. An accelerometer added further controls: flattening the hand on the instrument lowered the volume and raising it upright increased it, and shaking the hand in place played vibrato.

We tried to optimize the code by limiting the sensor to a useful range and adding delays between serial messages: enough data to track the hand precisely, but not so much that Unity lagged. Even so, with all the sensors streaming in there was still some lag, so for our MVP we went with just the ultrasonic sensor, playing different notes with hand movements.
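A minimal sketch of the Unity side of this pipeline, feeding the note mapper from the earlier sketch (the port name, baud rate, and calibrated distance range are assumptions):

```csharp
using System.IO.Ports;   // needs the .NET 4.x API level in Unity's player settings
using UnityEngine;

// Sketch: read ultrasonic distances (one "<cm>\n" line per reading) sent
// by the Arduino and normalize them for the GestureToHelm mapper above.
public class VaveSerialInput : MonoBehaviour
{
    public GestureToHelm mapper;                     // note mapper from the earlier sketch
    SerialPort port = new SerialPort("COM3", 9600);  // port name is an assumption
    const float minCm = 5f, maxCm = 45f;             // assumed calibrated playing range

    void Start()
    {
        port.ReadTimeout = 10;                       // don't block the frame waiting for data
        port.Open();
    }

    void Update()
    {
        string line;
        try { line = port.ReadLine(); }
        catch (System.TimeoutException) { return; }  // no new reading this frame

        if (float.TryParse(line, out float cm))
            mapper.SetGestureValue(Mathf.InverseLerp(minCm, maxCm, cm));
    }

    void OnDestroy()
    {
        if (port.IsOpen) port.Close();
    }
}
```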

Virtual Landscape

We also needed visual feedback for a complete experience, and all of it was interfaced in Unity itself. A sphere in VR world space moves with the movement of the hand (its position corresponds to the ultrasonic sensor data), and bars are laid along its path. When the sphere collides with a bar, the bar glows and ripples, and the musical note corresponding to that bar is played through Audio Helm. For the working prototype we kept it simple, but we also created richer visualizations that could enhance the experience of playing the instrument.
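A sketch of one such bar (assumes trigger colliders, a Rigidbody on the sphere, and a material with emission enabled; the tag and colors are illustrative):

```csharp
using UnityEngine;
using AudioHelm;

// Sketch of one bar in the landscape: it glows and sounds its note while
// the hand-driven sphere passes through.
public class NoteBar : MonoBehaviour
{
    public HelmController helm;
    public int midiNote = 67;     // the note assigned to this bar
    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("HandSphere")) return;
        helm.NoteOn(midiNote, 0.8f);
        mat.SetColor("_EmissionColor", Color.cyan);   // glow feedback
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("HandSphere")) return;
        helm.NoteOff(midiNote);
        mat.SetColor("_EmissionColor", Color.black);  // fade back
    }
}
```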

As the instrument is in VR, we decided to propose the possibility of a collaborative virtual environment where different people can come together for jamming. “Near far wherever you are, I believe in my VAVE, the show must go on and on…”

Final Prototype

Testing- Given To Use Without Instructions

CONCLUSION

Feedback and Future Scope

The experience of creating music was indeed enjoyable. People were fascinated by being able to create music by "vaving" a hand in the air, and the gestures were perceived as intuitive, natural, and enjoyable. We had not instructed the users on how to play the instrument, as we wanted to see whether they could figure it out; users who were familiar with playing an instrument were able to figure out the interactions on their own. The main scope for improvement was the fixed scale and the limited single octave: there should be ways to calibrate to different scales and to extend the range. One option could be to move between octaves by sliding, as in applications like Perfect Piano, but that could hamper the experience. Another could be to make the area beyond the physical instrument interactive as well.

Learnings

Through the project we learnt technical skills like using Arduino, sensors, and Unity. We also learnt design skills like quickly prototyping and testing ideas and getting expert feedback. It was also exciting to think about tangible interactions and gestures for controlling musical variables.
