Duration: December 2019 - February 2020

Tools: Unity built-in / HLSL shader code

Roles: Senior digital artist, lead technical artist

Highlights

  • Live performance by Galantis using their live MIDI drumming setup

  • Real-time XSens mocap data streamed into engine

  • Custom virtual drum stick visualizer driven by MIDI from their drum pads

  • VR audience

  • Interactive live streaming (users see text appear on screen and can vote for different outcomes throughout the show)

 

About

In December of 2019, while I was working on a virtual concert for Tinashe at Wave, I was also tasked with generating an artistic style and direction for an upcoming virtual concert we would be doing with the live performance / DJ duo Galantis.

The show would be based on their new album, with the concept of the “church of Galantis” as a unifying theme, church serving as a metaphor for a congregation of people united by their belief in community, support, and appreciation for each other. The album cover employed many colorful and textural elements drawing from animals, insects, and religion-inspired architectural structures and patterns.

The cover of the new Galantis album that we were building the show off of

Below is a GIF of an animated concept I came up with, built in the engine we were using at the time, Unity (with the built-in renderer). I went with a low-poly style to play off the idea of shiny crystalline objects and stained glass, and wrote a custom shader to animate a chain of objects into interesting spiral-like forms.

This was before VFX Graph, and one of our technical constraints at the time was that we could not easily add custom C# scripts to a show: our shows were packaged as asset bundles that were loaded into our platform dynamically, so any new code had to be added to the platform itself and shipped in a new build.

With this limitation, I made a system that prepares a mesh in the editor (essentially duplicating it and assigning custom vertex attributes to each duplicate). A custom shader I wrote then manipulates all of the duplicates in the vertex shader stage, rotating and twisting them into long, tendril-like structures. This was done with rotation matrix math and a for loop that inherits each rotation down the chain.
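The chained-rotation idea can be sketched outside of HLSL. Below is a minimal Python version of the math: each segment inherits the accumulated rotation of every segment before it, so a constant per-segment twist curls the chain into a spiral. The function names, segment count, and angles are illustrative, not the original shader code.

```python
import math

def rot2d(theta):
    """2x2 rotation matrix for angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(a, b):
    """Multiply two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(m, v):
    """Apply a 2x2 matrix to a 2D vector."""
    return [sum(m[i][k] * v[k] for k in range(2)) for i in range(2)]

def chain_positions(n, segment_length, twist_per_segment):
    """Place n segments tip-to-tail. Each segment inherits the
    accumulated rotation of all segments before it (the for loop
    from the vertex shader). Returns the endpoint of each segment."""
    points = []
    accumulated = rot2d(0.0)  # identity
    tip = [0.0, 0.0]
    for _ in range(n):
        accumulated = matmul(accumulated, rot2d(twist_per_segment))
        step = matvec(accumulated, [segment_length, 0.0])
        tip = [tip[0] + step[0], tip[1] + step[1]]
        points.append(tip)
    return points

# A constant twist per segment curls the chain into a spiral-like form;
# in the shader, animating this angle over time animates the whole chain.
pts = chain_positions(12, 1.0, math.pi / 8)
```

In the real system this loop ran per-vertex on the GPU, with the per-duplicate index stored in the custom vertex attributes assigned at mesh-preparation time.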

Original concept I made for Galantis


The concept was a hit and was approved by the artists and Wave, so using the above animation and other images as reference, I conveyed the direction to two artists on our team, including the artist who would lead the art production on this show. Afterwards, I finished directing the show for Tinashe, and after a short break, quickly jumped on the Galantis show to help see it through till the end.

Apart from creating a lot of the visual content in the show, I also designed and built two systems I’m proud of given the constraints of this production. The first solved the problem of the drum sticks. Galantis performs live by drumming on Roland SPD sampling pads. We had motion capture of their bodies covered with XSens, but it proved too difficult to track the drum sticks with the XSens accelerometer trackers given the size, speed, and impact of a drumstick’s movement. We still needed to visualize the drum sticks, though, and we couldn’t leave them permanently attached to the avatar’s hands when the live performer wasn’t actually holding them; a stick attached to an open palm, for example, would look awkward.

So I came up with a solution that used the MIDI already being sent out of their Roland sampling pads. When a MIDI note was detected within a certain range and channel, the virtual drum sticks of the corresponding performer animated on, then automatically faded out after about a second. This ensured the sticks were only visible while the performers were playing. For the situations where they were holding the sticks but not playing (like posing with their fists in the air), a manual knob could override the visibility of the sticks. Given the constraints of this production (including not being able to coordinate with the artists until the day of the show itself), this turned out to be a good solution overall.
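The gating logic described above is simple enough to sketch. Here is an illustrative Python version, not the production Unity code: the class name, channel, note range, and fade duration are assumptions standing in for whatever the show actually used.

```python
FADE_SECONDS = 1.0  # roughly matches the ~1 s fade described above

class DrumStickVisibility:
    """Sketch of MIDI-gated stick visibility (illustrative, not the
    production system). Notes in the configured range and channel keep
    the sticks visible; they fade out about a second after the last hit
    unless the manual override knob forces them on."""

    def __init__(self, channel, note_range):
        self.channel = channel          # MIDI channel for this performer's pads
        self.note_range = note_range    # (low_note, high_note), inclusive
        self.last_hit = None            # timestamp of the most recent hit
        self.override_on = False        # manual knob for posing-with-sticks moments

    def on_midi_note(self, channel, note, t):
        """Call on each incoming MIDI note-on, with timestamp t in seconds."""
        low, high = self.note_range
        if channel == self.channel and low <= note <= high:
            self.last_hit = t

    def visible(self, t):
        """Whether the sticks should be shown at time t."""
        if self.override_on:
            return True
        return self.last_hit is not None and (t - self.last_hit) < FADE_SECONDS
```

In use, the engine would query `visible()` every frame and drive a fade animation from the transitions, rather than hard-cutting the sticks on and off.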

The other system I designed and built was a method for animating a crowd of avatars. Taking lots of individual dance animations we had captured with our XSens suit, I quickly made a system in Unity to scatter and offset those animations across a set of humanoid avatars, and devised a way to trigger different “energy levels” of movement (low, mid, and high) from the Unity timeline. The system would transition between the animations so the crowd appeared to respond to the current energy of the music.
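The scatter-and-offset step can be sketched as follows. This is an illustrative Python sketch, not the Unity implementation; the clip names and offset range are hypothetical stand-ins for the captured mocap takes.

```python
import random

# Hypothetical clip names standing in for the captured XSens dance takes,
# bucketed by the three energy levels triggered from the timeline.
ENERGY_CLIPS = {
    "low":  ["sway_a", "sway_b"],
    "mid":  ["groove_a", "groove_b"],
    "high": ["jump_a", "jump_b"],
}

def assign_crowd_animations(num_avatars, energy, seed=0):
    """Assign each avatar a random clip for the given energy level plus a
    per-avatar start-time offset, so the crowd doesn't move in lockstep.
    The timeline would re-run this on each energy change and crossfade
    avatars to their new assignments."""
    rng = random.Random(seed)  # seeded so the scatter is repeatable per show
    assignments = []
    for avatar_id in range(num_avatars):
        clip = rng.choice(ENERGY_CLIPS[energy])
        offset = rng.uniform(0.0, 2.0)  # start-time offset in seconds
        assignments.append((avatar_id, clip, offset))
    return assignments

crowd = assign_crowd_animations(8, "high")
```

Seeding the random scatter keeps the crowd layout deterministic between rehearsal and broadcast while still looking organic.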

Below is a recording of a re-broadcast of the show we did shortly after the Covid-19 pandemic began to help raise money for artists affected by the lockdown.