We Are Listen approached me to build a prototype for pitching an interactive installation to Brian Eno: users wear a HoloLens and create tones that are visualized as floating holographic orbs, in a style very similar to Brian Eno's well-known "Bloom" iPhone app, but in 3D and in mixed reality.
Since the initial pitch and prototype, We Are Listen has built out a full multi-user experience for the public at The Transformatorhuis in Amsterdam: http://bloomopenspace.com/
The prototype explores these areas:
- Visualizing the tone of the sound (shader, animation, and FX look development)
- How the sound-producing orbs are created:
  - Placing the orb at the location of the user's hand
  - Placing the orb in front of the user's gaze
  - (In both cases, the air-tap gesture initiates the action of creating an orb)
- What is the best input method given the hardware's limitations?
- Spatial sound design
- How many orbs can be active at once?
- How do we ensure a seamless experience if sounds must be capped at a certain number?
- How do we map input to the different pitches of sounds? How do we help the user discover how this mapping works?
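On the last question, one discoverable approach (used by apps like Bloom) is to quantize the placement position to a musical scale, so any input sounds consonant. The sketch below is my own illustration of the idea, not the prototype's actual mapping: the scale choice, height range, and function names are all assumptions.

```python
# Hypothetical sketch: map an orb's vertical placement to a note.
# The pentatonic scale and the 0.5-2.5 m height range are assumptions
# chosen for illustration, not taken from the prototype.

C_MAJOR_PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within an octave

def height_to_midi_note(height_m, min_h=0.5, max_h=2.5, base_note=48, octaves=2):
    """Quantize a hand/gaze height (meters) to a pentatonic MIDI note."""
    # Normalize height into [0, 1], clamping out-of-range placements.
    t = max(0.0, min(1.0, (height_m - min_h) / (max_h - min_h)))
    steps = len(C_MAJOR_PENTATONIC) * octaves
    idx = min(int(t * steps), steps - 1)
    octave, degree = divmod(idx, len(C_MAJOR_PENTATONIC))
    return base_note + 12 * octave + C_MAJOR_PENTATONIC[degree]
```

Because every reachable note lies on the same scale, the user can discover the mapping by experimentation without ever producing a dissonant result.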
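For keeping the experience seamless under a sound cap, one common pattern is an oldest-out voice pool: when a new orb would exceed the cap, the oldest orb is retired (faded out) rather than rejecting the user's input. This is a minimal sketch of that pattern; the class name, cap value, and fade behavior are illustrative assumptions, not the prototype's implementation.

```python
from collections import deque

class OrbPool:
    """Cap the number of simultaneously sounding orbs.

    When the cap is exceeded, the oldest orb is retired first (e.g.
    faded out by the caller), so the experience degrades gracefully
    instead of refusing new input or overloading the audio engine.
    """

    def __init__(self, max_active=16):
        self.max_active = max_active
        self.active = deque()  # oldest orb sits at the left end

    def add(self, orb_id):
        """Register a new orb; return the orb to fade out, if any."""
        retired = None
        if len(self.active) >= self.max_active:
            retired = self.active.popleft()
        self.active.append(orb_id)
        return retired
```

A caller would cross-fade the returned orb's audio source to zero before destroying it, so the cap is never audible as a hard cutoff.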