
Airjam: A Sensor/Gesture-Based

Musical Instrument

July - Aug 2014,  Jan - Apr 2015

User Research  |  Concept Development  |  Gesture Pattern |  Interaction Design  |  UI/UX

Airjam utilizes a wearable band with a gesture-based interface that plays music using gyro-sensors and EMGs.

Once a user wears the arm band, the service plays tunes via the user’s gestures so that they can play along with an ensemble. This project was completed in collaboration with industrial designers and hardware/software developers.

MINDSET

Shift focus of the new product development

from being technology-centered to human-centered design.

BACKGROUND

When playing an instrument, an immersive environment is significant, as it allows users to lose themselves in their musical expression. However, many existing musical instrument applications are limited to on-screen touch experiences. Moreover, for users who are not skilled at playing instruments, musical expression remains just a dream.

MY QUESTIONS IN THE INITIAL PHASE

Q1.   What elements help to maximize the realism of playing a musical instrument?

Q2.   How can non-musical users be encouraged to enjoy performing?

PROBLEM DEFINITION

I conducted in-depth semi-structured interviews with users who had experienced musical instrument apps to capture how they felt about current interfaces and what their unmet needs were. Based on the research findings, I developed directions for a new input system and a mobile application that help users perform sophisticated musical expression and maximize the realism of playing a musical instrument.

New Input System

APPROACHES I TOOK

1.  Defined what could serve as a new input that would allow players to perform, and how we could let multiple users play music interactively, like a band or ensemble.

2.  Designed the application interfaces that support this project in achieving its goal.

I created user flows, and once the interaction flows were determined, I drew each screen in more detail. When designing the user interface, I focused on simplifying complex components so that users could concentrate on performing without being distracted by the UI.

Based on the wireframes, I created paper prototypes to depict user-side interaction with the product. The interactions were then validated with developers.

3.  Worked on identifying gesture patterns as inputs to the system.

I conducted focus group interviews (FGIs) with professional musicians in order to understand the principles of playing and the characteristics of each instrument.


In addition, several rounds of usability tests were conducted with developers to identify how the sensor technologies (electromyogram and gyro sensors) could work well with players' playing patterns, and to resolve technical issues.
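As an illustration of the kind of sensor-fusion logic this testing explores, the sketch below (hypothetical, not the project's actual implementation) fires a "strum" only when the gyro shows a fast wrist swing and the EMG shows enough muscle activation, which filters out incidental arm movement. Function names and threshold values are illustrative assumptions, not tuned parameters from the project.

```python
def emg_rms(samples):
    """Root-mean-square amplitude of a window of EMG samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def detect_strum(gyro_dps, emg_window,
                 gyro_threshold=200.0,   # deg/s of wrist rotation (illustrative)
                 emg_threshold=0.3):     # normalized muscle activation (illustrative)
    """Return True only when both sensors agree the gesture was intentional."""
    return abs(gyro_dps) > gyro_threshold and emg_rms(emg_window) > emg_threshold

# A fast swing with a tensed grip triggers a note...
print(detect_strum(350.0, [0.40, 0.50, 0.45]))  # True
# ...while the same swing with a relaxed arm does not.
print(detect_strum(350.0, [0.05, 0.04, 0.06]))  # False
```

Requiring both signals to agree is one way to address the false triggers that gesture interfaces are prone to: the gyro alone cannot distinguish a deliberate strum from reaching for a cup.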

PROJECT GOAL

1. To develop a digital musical instrument with a new input system that embodies the characteristics of an analog musical instrument by reuniting controls and sound production.

2. To provide an interface that allows multiple users to play music together like a band.

Iteration

- Replaced the instrument selection bar with a floating action button, reducing the space it occupies while still letting users change instruments promptly. When pressed, it displays the different instruments.


- Changed the screen structure by moving the player toolbar to the bottom and adding a side nav and an app bar/primary toolbar, making the layout more scalable as more features were added later.

- Added a visual indicator to show the current position for user interaction, which helps users perform more sophisticated musical expression and maximizes the realism of playing.

2nd Version

DESIGN DECISIONS

In the first version, I had a fixed global navigation area so users could easily change instruments and settings regardless of which screen they were on. However, it took up too much space, especially while playing the instrument. Users were also confused that the player toolbar sat at the top, since it was not a primary feature they used that often.

1st Version

Finding

Intuitive Interface
Gesture-based interfaces and dynamic visual feedback enable users to perform sophisticated musical expressions.
WeJam: Play together
Once the user wears the band, the service plays tunes via the user’s gestures so that they can play along with an ensemble.
Turn your movements into music.
Enjoy the incredible freedom of movement and create your own music with a variety of instrument kits that make you feel like a musician.
Final Words...

Overall, I went through a process of trial and error, including technical issues such as gesture mapping, learning and applying musical knowledge, and planning for the exhibition. This project was very meaningful: I experienced technical cooperation with other designers and developers, turned ideas into reality, and learned about partnership and time management.
