live protocol

The Next Generation of Live Shows

At S9, we teamed up with Alberto Ramirez to bring a bold idea to life: driving CG content for live shows in real time while keeping full control of the design on a lighting console over DMX protocols.

By combining our expertise in Virtual Production Design with a deep Unreal Engine technical background, we created a mind-bending optical illusion: a virtual set extension that seamlessly replicates real-world light fixtures in a virtual environment.

3D Asset Creation

  • Crafted a detailed 3D environment that creates an optical illusion by mirroring real-life light fixtures, blending virtual and physical elements to enhance immersion.

  • Designed and integrated a robotic arm, showcasing S9's expertise in character animation to add dynamic movement and interactive storytelling to the prototype.

  • Ensured all assets were optimized for real-time rendering in Unreal Engine 5.5, maintaining exceptional performance and visual fidelity.

Blueprint

  • Built a versatile Blueprint system on top of DMX protocols, enabling integration with industry-standard lighting consoles like GrandMA. This allows real-time control and programming for any live show or interactive installation; a standalone sketch of the DMX packet handling follows this list.

  • Integrated NewTek NDI so the system can receive live video feeds, enabling interaction with any video feed distributed across the network for dynamic, real-time visuals (an NDI receive sketch also follows this list).

  • Designed dynamic shaders to drive DMX-controlled animations, allowing the lighting and visuals to adapt in real time to external triggers and creating a seamless, immersive experience (see the parameter-mapping sketch below).
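
The DMX side of this Blueprint lives inside Unreal, but the protocol it rides on is simple enough to illustrate on its own. Below is a minimal, standalone C++ sketch, not the project's actual Blueprint, that decodes an ArtDMX packet of the kind a console such as GrandMA broadcasts over Art-Net and extracts the channel values of one universe. The four-channel fixture at address 1 is a hypothetical example patch, not the show's.

```cpp
#include <array>
#include <cstdint>
#include <cstring>
#include <iostream>
#include <optional>
#include <vector>

// Decoded contents of one ArtDMX packet (one DMX universe, up to 512 channels).
struct DmxUniverse {
    uint16_t universe = 0;
    std::vector<uint8_t> channels;  // channel 1 is channels[0]
};

// Parse an ArtDMX packet per the public Art-Net specification.
// Returns std::nullopt if the buffer is not a valid ArtDMX packet.
std::optional<DmxUniverse> ParseArtDmx(const uint8_t* data, size_t size) {
    constexpr size_t kHeaderSize = 18;
    static const char kId[8] = {'A', 'r', 't', '-', 'N', 'e', 't', '\0'};
    if (size < kHeaderSize || std::memcmp(data, kId, 8) != 0) return std::nullopt;

    const uint16_t opcode = data[8] | (data[9] << 8);        // little-endian
    if (opcode != 0x5000) return std::nullopt;               // 0x5000 = OpDmx

    DmxUniverse out;
    out.universe = data[14] | (data[15] << 8);               // SubUni + Net
    const uint16_t length = (data[16] << 8) | data[17];      // big-endian, 2..512
    if (size < kHeaderSize + length) return std::nullopt;
    out.channels.assign(data + kHeaderSize, data + kHeaderSize + length);
    return out;
}

int main() {
    // Hypothetical 4-channel fixture patched at address 1: dimmer, R, G, B.
    std::array<uint8_t, 18 + 4> packet = {'A', 'r', 't', '-', 'N', 'e', 't', '\0',
                                          0x00, 0x50,  // OpDmx
                                          0, 14,       // protocol version
                                          0, 0,        // sequence, physical
                                          0, 0,        // universe 0
                                          0, 4,        // 4 channels of data
                                          255, 128, 64, 32};
    if (auto dmx = ParseArtDmx(packet.data(), packet.size())) {
        std::cout << "dimmer=" << int(dmx->channels[0])
                  << " r=" << int(dmx->channels[1])
                  << " g=" << int(dmx->channels[2])
                  << " b=" << int(dmx->channels[3]) << "\n";
    }
}
```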
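
The NDI integration is likewise handled inside Unreal. As a rough, standalone illustration of what a receive path looks like, here is a sketch against the NDI SDK's C API (Processing.NDI.Lib.h); treat it as the generic SDK workflow rather than the prototype's implementation. It discovers the first source announced on the network, connects, and captures a single video frame.

```cpp
#include <cstdint>
#include <cstdio>
#include <Processing.NDI.Lib.h>

int main() {
    if (!NDIlib_initialize()) return 1;            // SDK not supported on this CPU

    // Discover NDI sources announced on the local network.
    NDIlib_find_instance_t finder = NDIlib_find_create_v2(nullptr);
    uint32_t num_sources = 0;
    const NDIlib_source_t* sources = nullptr;
    while (num_sources == 0) {
        NDIlib_find_wait_for_sources(finder, 1000 /* ms */);
        sources = NDIlib_find_get_current_sources(finder, &num_sources);
    }

    // Connect a receiver to the first source found.
    NDIlib_recv_instance_t receiver = NDIlib_recv_create_v3(nullptr);
    NDIlib_recv_connect(receiver, &sources[0]);
    NDIlib_find_destroy(finder);

    // Capture one video frame (a live system would loop here every frame).
    NDIlib_video_frame_v2_t video;
    if (NDIlib_recv_capture_v2(receiver, &video, nullptr, nullptr, 5000) ==
        NDIlib_frame_type_video) {
        std::printf("Received %dx%d frame\n", video.xres, video.yres);
        NDIlib_recv_free_video_v2(receiver, &video);
    }

    NDIlib_recv_destroy(receiver);
    NDIlib_destroy();
    return 0;
}
```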
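
Inside the engine, decoded channel data ultimately lands in shader parameters. The sketch below shows one plausible mapping from five consecutive DMX channels to normalized material inputs; the channel layout (dimmer, RGB, strobe) and the struct are assumptions for illustration. In Unreal such values would typically be pushed to a dynamic material instance each tick (for example via SetScalarParameterValue and SetVectorParameterValue).

```cpp
#include <cmath>
#include <cstdint>

// Normalized parameters a material/shader would consume each frame.
// The channel layout (dimmer, R, G, B, strobe rate) is an assumed example
// patch, not the patch used in the actual show file.
struct FixtureShaderParams {
    float intensity = 0.0f;       // 0..1 emissive multiplier
    float color[3] = {0, 0, 0};   // linear RGB, 0..1
    float strobeHz = 0.0f;        // 0 = no strobe
};

inline float Normalize(uint8_t v) { return v / 255.0f; }

// Convert five consecutive DMX channels into shader parameters.
// `channels` points at the fixture's start address within a 512-byte universe.
FixtureShaderParams DmxToShaderParams(const uint8_t* channels, double timeSeconds) {
    FixtureShaderParams p;
    p.strobeHz = Normalize(channels[4]) * 25.0f;   // channel 5 scales up to 25 Hz
    // Square-wave strobe gate derived from elapsed time.
    const float gate = (p.strobeHz <= 0.0f)
        ? 1.0f
        : (std::fmod(timeSeconds * p.strobeHz, 1.0) < 0.5 ? 1.0f : 0.0f);
    p.intensity = Normalize(channels[0]) * gate;   // channel 1: dimmer
    p.color[0] = Normalize(channels[1]);           // channels 2-4: RGB
    p.color[1] = Normalize(channels[2]);
    p.color[2] = Normalize(channels[3]);
    return p;
}
```

Keeping the channel-to-parameter mapping in one place like this makes it straightforward to re-patch when the console's fixture profile changes.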

Where Real-Time Meets Traditional Content

  • This feature blends pre-rendered content with real-time adaptability, offering unparalleled flexibility for live productions. Textures can be used to drive light fixture behaviours, or content can be displayed directly on virtual LED surfaces, creating dynamic visuals that integrate effortlessly into any show design; a pixel-mapping sketch of the texture-to-fixture path follows below.
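
One way to picture the texture-to-fixture path is classic pixel mapping: sample the content frame and re-emit the sampled colours as DMX levels. The sketch below does this for a single row of an RGBA8 frame driving a strip of three-channel RGB fixtures. The frame format, fixture layout, and function name are illustrative assumptions; in the prototype itself this kind of mapping sits in the Blueprint and shader layer described above.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Pixel-map one row of an RGBA8 content frame onto a strip of RGB fixtures.
// Each fixture occupies three consecutive DMX channels (R, G, B) starting at
// `startChannel` (1-based, as addressed on a console). The frame format and
// fixture layout here are illustrative assumptions.
std::vector<uint8_t> PixelMapRowToDmx(const uint8_t* rgba, int width, int height,
                                      int row, int fixtureCount, int startChannel) {
    std::vector<uint8_t> universe(512, 0);                   // one DMX universe
    const int y = std::clamp(row, 0, height - 1);
    for (int i = 0; i < fixtureCount; ++i) {
        // Nearest-neighbour sample under each fixture along the row.
        const int x = std::min(width - 1, i * width / std::max(1, fixtureCount));
        const uint8_t* px = rgba + 4 * (y * width + x);
        const int base = (startChannel - 1) + 3 * i;         // 0-based channel index
        if (base + 2 >= 512) break;                          // stay inside the universe
        universe[base + 0] = px[0];  // R
        universe[base + 1] = px[1];  // G
        universe[base + 2] = px[2];  // B
    }
    return universe;
}
```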

Animation Sequence

  • Designed the system to work smoothly with pre-established animated patterns or sequences, providing a robust queue-based or timeline-driven workflow (a minimal cue-stack sketch follows this list).

  • Enabled easy integration of pre-programmed animations, allowing for smooth transitions and dynamic adaptability during live performances.
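
As a minimal sketch of that queue-driven workflow (assuming a simple "go"-button cue stack with crossfades, not the project's actual sequence data), the class below steps through pre-programmed looks and interpolates between them once per frame. A timeline-driven variant would replace Go() with lookups against timecode.

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// One pre-programmed look: a target level per fixture plus a fade time.
// The cue contents are placeholders, not the show's actual sequences.
struct Cue {
    std::string name;
    std::vector<float> levels;   // 0..1 per fixture
    float fadeSeconds = 0.0f;
};

// Minimal cue stack: Go() advances to the next cue, Update() crossfades toward it.
class CueStack {
public:
    explicit CueStack(std::vector<Cue> cues) : cues_(std::move(cues)) {}

    void Go() {
        if (current_ + 1 < static_cast<int>(cues_.size())) {
            ++current_;
            elapsed_ = 0.0f;
            start_ = output_;    // fade from whatever is on stage right now
        }
    }

    // Call once per frame with the frame's delta time; returns current levels.
    const std::vector<float>& Update(float dt) {
        if (current_ < 0) return output_;
        const Cue& cue = cues_[static_cast<std::size_t>(current_)];
        elapsed_ += dt;
        const float t = cue.fadeSeconds <= 0.0f
                            ? 1.0f
                            : std::min(1.0f, elapsed_ / cue.fadeSeconds);
        output_.resize(cue.levels.size(), 0.0f);
        start_.resize(cue.levels.size(), 0.0f);
        for (std::size_t i = 0; i < cue.levels.size(); ++i)
            output_[i] = start_[i] + (cue.levels[i] - start_[i]) * t;
        return output_;
    }

private:
    std::vector<Cue> cues_;
    std::vector<float> start_, output_;
    int current_ = -1;      // -1 = nothing fired yet
    float elapsed_ = 0.0f;
};
```

Hooking Go() to a trigger from the console would make the same structure show-controlled.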

Offline and Real-Time Render Output

  • Delivered cinematic-quality visuals through offline rendering with Movie Render Queue, and ensured seamless interactivity with real-time outputs in Unreal Engine 5.5 for live shows.

  • Optimized both workflows to achieve a perfect balance of performance and visual fidelity, providing versatility for high-impact presentations and dynamic live experiences.

team

  • Raul Baeza

    Raul, an experienced CG artist and supervisor, excels in animation, previsualization, and motion graphics, creating captivating visuals for advertising, film, and live shows through collaboration and storytelling.
    Raul contributed to the project by designing and creating environments, assets, and animations, all integrated into the prototype to be triggered seamlessly within Unreal.

  • Alexander Debavelaere

    Also known by his VJ alias Alex Vlair, Alex brings over 10 years of experience creating immersive visuals for events and advertising. He focuses on the music industry, specializing in real-time 3D content built with Notch and Unreal Engine.

    Alex applied his knowledge of real-time workflows and live-show protocols to establish the framework for this demonstration and assisted with its technical implementation and execution.

  • Eddy Chan

    Eddy is a veteran VFX artist with over 20 years of experience creating visual content for advertising, immersive experiences, and cinematic productions. He has worked with top studios like Rodeo FX and BLVD, specializing in CG lighting, compositing, and real-time rendering.

    For this prototype, Eddy developed the Blueprint that connects DMX protocols to lights and animations and links DMX data to the dynamic shaders. His technical expertise ensures seamless real-time interactivity and visually adaptive results, showcasing the potential of this cutting-edge solution.