At Reactional Music, our participation at the Unite 2023 Unity Conference in Amsterdam marked a significant step in demonstrating the full practical applications of our technology in the gaming world. As sponsors and participants, we showcased how our innovative Reactional Engine can integrate with gaming environments and how generative music can impact game development.
On the last night, a thousand games developers came together for a party to celebrate Unite 2023. Reactional partnered with Unity to create a special event that, for the first time anywhere, combined live gameplay with a live DJ and a light show driven by the gameplay, all triggered and synced in real time by the Reactional Engine.
In this, the first of a two-part insight into the partnership with Unity, Reactional Music’s chief product officer Jonas Kjellberg shares the experience, the integration process and the impact of our technology on enhancing gaming experiences with responsive and dynamic music.
To open the Unite event in front of a thousand games developers we created a live demo based on the Unity Spaceship scene. This project, initially a live showcase for Unity’s Visual Effect Graph system, was transformed by our technology to sync in-game elements with a live DJ performance by DJ FELICIA. The Reactional Engine is also able to send in-game data to a lighting desk, triggering live lighting effects.
DJ integration
The spaceship scene marked a grand and impressive start to the party.
As part of her DJ set, FELICIA herself triggered the start of the game scene and music, allowing it to blend seamlessly into her live set. This was done over OSC using QLab.
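For readers curious about the plumbing, here is a minimal sketch of what the receiving side of that handshake can look like in a Unity script. It assumes the cue arrives as a plain OSC message over UDP; the port, the address pattern "/reactional/start" and the start-up call are illustrative, not the actual values used at the show.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Minimal sketch: listen for an OSC cue from the DJ rig (e.g. QLab) and use
// it to kick off the game scene and music. Address and port are illustrative.
public class SceneStartListener : MonoBehaviour
{
    public int listenPort = 9000;          // assumed port, configured to match QLab
    UdpClient udp;
    IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);

    void Start() => udp = new UdpClient(listenPort);

    void Update()
    {
        // Poll without blocking the main thread.
        while (udp.Available > 0)
        {
            byte[] packet = udp.Receive(ref remote);

            // The OSC address is the leading null-terminated, 4-byte padded string.
            int end = System.Array.IndexOf(packet, (byte)0);
            string address = Encoding.ASCII.GetString(packet, 0, end);

            if (address == "/reactional/start")   // hypothetical cue address
                StartSceneAndMusic();
        }
    }

    void StartSceneAndMusic()
    {
        Debug.Log("Cue received from the DJ rig - starting scene");
        // ... scene and music start-up calls would go here ...
    }

    void OnDestroy() => udp?.Close();
}
```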
Reactional performs music analysis on any music it’s fed, creating a unique experience where the game scene can react to the beat and rhythm of the music. This meant that every beat, note, and melody of the DJ set could be reflected in the visuals of the spaceship scene, creating an immersive audio-visual experience for the party-goers.
All synchronisation between music and game scene happens live. Nothing is pre-baked.
Directly after the spaceship scene, the night continued with an electrifying DJ set from FELICIA, keeping the energy high and the party in full swing.
Beat clock and shader events
The Reactional beat clock played a central role in triggering and synchronizing shader events, which included the leaky steam pipes that punctuated the scene with steam bursts on beat, along with electrical sparks timed to the rhythm.
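As a rough sketch of the idea, the component below emits a burst of steam particles whenever a new beat arrives. How the beat number reaches the script is simplified here: the beatIndex field stands in for the Reactional beat clock, which we cannot reproduce in a short example.

```csharp
using UnityEngine;

// Minimal sketch: fire a steam burst once per beat. beatIndex is assumed to
// be updated every frame by whatever beat clock drives the scene.
public class SteamPipeOnBeat : MonoBehaviour
{
    public ParticleSystem steamBurst;   // the leaky-pipe effect
    public int burstParticles = 40;

    public int beatIndex;               // hypothetical beat-clock input
    int lastBeat = -1;

    void Update()
    {
        if (beatIndex != lastBeat)
        {
            lastBeat = beatIndex;
            steamBurst.Emit(burstParticles);   // one puff of steam per beat
        }
    }
}
```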
Shader changes and visual effects
The beat clock also drove gradual shader changes, creating a dynamic, responsive environment that moved with the music. The holographic table, for instance, undulated in time with the beat clock, creating a mesmerising visual effect.
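Conceptually this is a continuous signal rather than a discrete event: a 0-to-1 phase within the current beat drives a shader parameter every frame. The sketch below assumes such a phase value is available; the material property name "_Undulation" is illustrative, standing in for whatever the holographic-table shader actually exposes.

```csharp
using UnityEngine;

// Minimal sketch: a shader parameter riding the beat clock. beatPhase is a
// hypothetical 0..1 value fed by the beat clock every frame.
public class HoloTableUndulation : MonoBehaviour
{
    public Renderer tableRenderer;
    [Range(0f, 1f)] public float beatPhase;

    static readonly int UndulationId = Shader.PropertyToID("_Undulation");

    void Update()
    {
        // Smooth wave that peaks once per beat, so the surface breathes with the music.
        float wave = 0.5f + 0.5f * Mathf.Sin(beatPhase * 2f * Mathf.PI);
        tableRenderer.material.SetFloat(UndulationId, wave);
    }
}
```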
Transformation and animation linked to the beat
The transformation of shaders and objects was linked to the beat. Depending on the current beat modulo and the frequency requested, shaders and objects would scale their transform, creating an evolving, organic scene in tune with the music. The beat clock progression combined with animation curves also served to animate lights, adding another layer of dynamic visuals to the scene.
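A simplified version of that logic might look like the sketch below: an object pulses only on every Nth beat (its requested frequency), and a light is animated by sampling an animation curve with the beat phase. The beatIndex and beatPhase fields again stand in for whatever the beat clock actually provides.

```csharp
using UnityEngine;

// Minimal sketch of the beat-modulo idea: pulse the transform on every Nth
// beat and drive a light from an animation curve evaluated over the beat.
public class BeatModuloPulse : MonoBehaviour
{
    public int frequency = 4;            // react on every 4th beat
    public float pulseScale = 1.3f;
    public Light pulsedLight;
    public AnimationCurve lightCurve = AnimationCurve.EaseInOut(0f, 0.2f, 1f, 2f);

    public int beatIndex;                // hypothetical beat-clock inputs
    [Range(0f, 1f)] public float beatPhase;

    Vector3 baseScale;

    void Start() => baseScale = transform.localScale;

    void Update()
    {
        // Scale up at the start of every Nth beat, then relax back over the beat.
        bool onAccent = beatIndex % frequency == 0;
        float target = onAccent ? Mathf.Lerp(pulseScale, 1f, beatPhase) : 1f;
        transform.localScale = baseScale * target;

        // Lights follow an animation curve driven by beat-clock progression.
        if (pulsedLight != null)
            pulsedLight.intensity = lightCurve.Evaluate(beatPhase);
    }
}
```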
Interactive elements
Other elements could also be locked to the beat, such as the alarm blaring at the end of the scene and the shaking of the ship. The ability to drive game scenes through music in real time gives a game or digital world the feel of a live music score that has been pre-baked with the scene, much like we see in films. Nothing could be further from the truth: the Reactional Engine achieves all of this in real time, so any piece of music can be brought in to shape the scene and its visuals.
The gameplay demo, the live music from DJ FELICIA and the data fed into the lighting desk together created an incredible audio-visual spectacle, strengthening the connection between the music and the game environment and making for an immersive, highly engaging experience for everyone who attended.
Lights controlled by game triggers
We employed a fantastic lighting designer, Leo Stenbeck. Leo has worked extensively in live settings in music and theatre and was also involved in Riot’s live esports events. He designed the layout of the lights and FX around the rig specifications, and planned how the lights would follow the music being played.
As most of the visual cues were locked to the Reactional beat clock, we realised that we could use the game triggers directly to control the lighting desk. We therefore created a system that uses a network protocol called OSC to trigger macros on the GrandMA3 lighting desk, a completely new capability in live gaming events which we believe will play a significant role in the future of live esports.
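For those interested in the nuts and bolts, here is a minimal sketch of what such a trigger can look like from the game side. OSC messages are small UDP packets made of a 4-byte-padded address string, a type-tag string and big-endian arguments, so a tiny hand-rolled encoder is enough. The desk IP, port and the "/macro/fire" address below are illustrative; the incoming addresses and the macros they fire are mapped in the GrandMA3 setup rather than fixed by the protocol.

```csharp
using System;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Minimal sketch: send an OSC message to the lighting desk to fire a macro.
// Host, port and address are illustrative; they are configured on the desk.
public class LightDeskTrigger : MonoBehaviour
{
    public string deskHost = "192.168.1.50";
    public int deskPort = 8000;
    UdpClient udp;

    void Start()
    {
        udp = new UdpClient();
        udp.Connect(deskHost, deskPort);
    }

    public void FireMacro(int macroNumber)
    {
        byte[] msg = Concat(OscString("/macro/fire"), OscString(",i"), OscInt(macroNumber));
        udp.Send(msg, msg.Length);
    }

    static byte[] OscString(string s)
    {
        int n = Encoding.ASCII.GetByteCount(s);
        var bytes = new byte[n + (4 - n % 4)];   // null-terminated, padded to a multiple of 4
        Encoding.ASCII.GetBytes(s, 0, s.Length, bytes, 0);
        return bytes;
    }

    static byte[] OscInt(int value)
    {
        var b = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian) Array.Reverse(b);   // OSC arguments are big-endian
        return b;
    }

    static byte[] Concat(params byte[][] parts)
    {
        int total = 0;
        foreach (var p in parts) total += p.Length;
        var all = new byte[total];
        int offset = 0;
        foreach (var p in parts) { p.CopyTo(all, offset); offset += p.Length; }
        return all;
    }

    void OnDestroy() => udp?.Close();
}
```

A call such as FireMacro(12) can then be wired to the same beat-clock events that drive the in-game visuals, so the desk reacts in step with the scene.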
For every visual element in the scene a message was transmitted containing the game object type, the object transform, the camera transform and the angle of the viewport relative to the object transform. This way we could compute, for instance, whether a yellow light in the game scene was flashing behind the camera. With this we aimed to extend the visual impact of the spaceship deep into the venue.
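As a rough illustration of that per-object data, the sketch below works out where a tracked element sits relative to the camera’s view. The component and field names are ours for the example, and packing the values into the outgoing OSC message would reuse the same encoding shown in the macro-trigger sketch above.

```csharp
using UnityEngine;

// Minimal sketch: compute the viewing angle between the camera and a tracked
// visual element, so the venue lights can respond to where it appears on screen.
public class LightReportedObject : MonoBehaviour
{
    public string objectType = "yellow_flash";   // illustrative type tag
    public Camera gameCamera;

    void Update()
    {
        Vector3 toObject = transform.position - gameCamera.transform.position;

        // 0 degrees = dead centre ahead of the camera, over 90 = behind it.
        float viewAngle = Vector3.Angle(gameCamera.transform.forward, toObject);
        bool behindCamera = viewAngle > 90f;

        // objectType, transform, gameCamera.transform and viewAngle would be
        // packed into the outgoing OSC message here.
        Debug.Log($"{objectType}: angle {viewAngle:F1}, behind camera: {behindCamera}");
    }
}
```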
We used a similar OSC process to trigger FX including CO2 jets and spark showers on the main stage.
Next song
Since the game scene reacts live to any music being played in the Reactional Track format, you could choose any track and still have the same ultra-tight musical experience, with the visuals following.
We then took a game scene and created a challenge for developers at Unite to take part in. In this challenge, the gameplay was synced with the live DJ set and sent live data to the lighting desk, meaning that everything stayed in sync.
Stay tuned for part 2 in the Unite afterparty case study to learn more.
About the author
Jonas Kjellberg is a games music composer and CPO at Reactional Music.