HEAT – Experimentations in Live Virtual Opera Performance

Last February, INO, along with our partners at Dublin City University, conducted our first tests of the Hybrid Extended Reality technologies being developed by HEAT, an EU-funded research and innovation project.
This project is designed to explore emerging holographic technologies and how they can enhance the audience experience of opera, both by incorporating holographic technologies into the live staging of opera and by using XR technologies to let remote audience members engage with a live performance.
Focusing first on the remote audience experience, we brought a singer, a dancer and a pianist together in a room full of technology to learn how these immersive tools can connect two performers in a virtual space.
Director Jo Mangan was on hand to devise creative workflows based around a scene from Donizetti’s Lucia di Lammermoor, allowing soprano Ami Hewitt, accompanied on piano by Aoife Moran, to perform live in virtual reality with dancer Stephanie Dufresne. We used volumetric capture setups in two different rooms, one for each performer. Three cameras were trained on each performer in a circular arrangement, with the performer in the middle, giving us a full 360-degree perspective.
The two volumetric renderings were each streamed live into the Holomit platform in virtual reality. Holomit is a communications platform being developed as part of the HEAT project by consortium partner i2CAT. It is a Unity-based application that transmits real-time 3D volumetric video into virtual and augmented reality environments: effectively a 3D virtual video-conferencing tool that lets users see 3D reconstructions of the other participants in the virtual space.
The soprano’s audio was captured via a simple microphone setup and transmitted into Holomit as a separate feed. Further tests will look at capturing the audio spatially and placing it in the virtual space, enhancing the fidelity of the experience and creating the illusion that the sound is coming directly from the live virtual singer.
This process allowed us to begin building effective workflows for combining traditional opera staging with virtual production setups: the first step in developing new forms of artistic expression for the virtual stage.
Many thanks to Anderson Simiscuka, Syed Mohammad Haseeb and the team at DCU for their technical facilitation for these experiments.
Stay tuned as we continue to explore the HEAT technologies in advance of our full opera pilot in spring 2027.
