Out of the Ordinary | Exploring Motion Capture

Last year, Out of the Ordinary director Jo Mangan and soprano Naomi Louisa O’Connell spent a day with Virtual Reality Ireland to explore the potential that motion capture technology holds for capturing opera performance for immersive media.
The motivation for these tests was to give Jo a sense of the possibilities of Rokoko motion capture technology and how it might be useful in the creation of our VR Community Opera Out of the Ordinary.
Using the micro opera TOUCH (an opera composed by Karen Power specifically for film, as part of INO’s #20ShotsofOpera project last year) as a test case, we created a short, immersive, animated version of the opera using Rokoko motion capture technology and Unreal Engine. The performance was captured, and the immersive version created, by Aisling Phelan and the team at Virtual Reality Ireland.
Initially, the 2D projection graphics from TOUCH, designed by Luca Truffarelli, were imported into Unreal Engine, where the team at VRI worked with the video content to create video textures for the immersive test. A number of pre-existing avatars were selected to layer over the motion capture data we would create, offering character options for the animation. The recorded audio track from TOUCH was used as the soundtrack, and we arranged a playback track for the motion capture session so the movements would reflect the original performance as closely as possible.
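For the technically curious, the sketch below shows roughly how projection video can be wired up as a video texture using Unreal’s editor Python scripting. It is a minimal illustration only: the file path and asset names are invented, and VRI’s actual setup may well have differed.

```python
import unreal

# Hypothetical file path and asset names, for illustration only.
VIDEO_FILE = "C:/Projects/Touch/ProjectionGraphics.mp4"
CONTENT_DIR = "/Game/Touch/Media"

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()

# A FileMediaSource points at the video file on disk.
media_source = asset_tools.create_asset(
    "MS_TouchProjection", CONTENT_DIR,
    unreal.FileMediaSource, unreal.FileMediaSourceFactoryNew())
media_source.set_editor_property("file_path", VIDEO_FILE)

# A MediaPlayer plays that source, and a MediaTexture exposes its frames
# so they can be plugged into a material on a surface in the scene.
media_player = asset_tools.create_asset(
    "MP_TouchProjection", CONTENT_DIR,
    unreal.MediaPlayer, unreal.MediaPlayerFactoryNew())
media_texture = asset_tools.create_asset(
    "MT_TouchProjection", CONTENT_DIR,
    unreal.MediaTexture, unreal.MediaTextureFactoryNew())
media_texture.set_editor_property("media_player", media_player)

unreal.EditorAssetLibrary.save_directory(CONTENT_DIR)
```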
We invited Naomi Louisa O’Connell, one of the performers in the original opera, to capture her performance using the motion capture suit. The body motion was captured via a Rokoko suit worn by Naomi, together with the Rokoko Studio software, and Jo directed Naomi through the physical performance along to playback of the music. There was a spirit of experimentation and play to the session, embracing happy accidents such as Naomi’s avatar slipping through the plane of the virtual floor and essentially disappearing from the scene: an interesting example of the opportunities artists can find in what might otherwise be seen as glitches or problems in the technology.
Once we had captured several takes of Naomi physically performing the piece, it was time to capture her facial movements. This was carried out using the AR face-tracking software on an Apple iPad Pro. Then it was time for the VRI team to bring all of the captured data together in Unreal Engine: character, movement, facial capture, the TOUCH video environment and, of course, the sound.
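To give a flavour of what that face data looks like: Apple’s AR face tracking reports a set of named blendshape weights (jawOpen, eyeBlinkLeft and so on, each between 0 and 1) for every frame, which then have to be matched up with the avatar’s own facial controls. The small Python sketch below is purely illustrative; the CSV export and the name mapping are hypothetical, not the tools VRI used.

```python
import csv

# Hypothetical mapping from ARKit blendshape names to this avatar's
# morph target names; real rigs often differ only in naming conventions.
ARKIT_TO_AVATAR = {
    "jawOpen": "JawOpen",
    "mouthSmileLeft": "Smile_L",
    "mouthSmileRight": "Smile_R",
    "eyeBlinkLeft": "Blink_L",
    "eyeBlinkRight": "Blink_R",
}

def remap_face_frames(csv_path):
    """Read per-frame blendshape weights (0.0-1.0) from a CSV export and
    rename them to the avatar's morph targets, dropping any channels the
    avatar does not have."""
    frames = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            frames.append({
                avatar_name: float(row[arkit_name])
                for arkit_name, avatar_name in ARKIT_TO_AVATAR.items()
                if arkit_name in row
            })
    return frames

# Example use (hypothetical file name):
# frames = remap_face_frames("touch_face_take01.csv")
```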
First, they exported Naomi’s facial and body motion captures. The motion capture data was cleaned up before all of Naomi’s captures were retargeted from the Rokoko Studio software onto a test avatar: a rigged 3D character model. This was then imported into a scene in Unreal Engine.
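As a rough illustration of that import step (not VRI’s actual pipeline), Unreal’s editor Python API can bring a retargeted FBX animation in against an existing skeleton. The file paths and asset names below are made up.

```python
import unreal

# Hypothetical file and asset paths for illustration.
FBX_FILE = "C:/Projects/Touch/Naomi_Take01_Retargeted.fbx"
DEST_PATH = "/Game/Touch/Animations"
SKELETON = unreal.load_asset("/Game/Touch/Avatars/TestAvatar_Skeleton")

# Configure an FBX import that only brings in animation data,
# binding it to the avatar's existing skeleton.
options = unreal.FbxImportUI()
options.set_editor_property("automated_import_should_detect_type", False)
options.set_editor_property("import_mesh", False)
options.set_editor_property("import_as_skeletal", False)
options.set_editor_property("import_animations", True)
options.set_editor_property("mesh_type_to_import", unreal.FBXImportType.FBXIT_ANIMATION)
options.set_editor_property("skeleton", SKELETON)

task = unreal.AssetImportTask()
task.set_editor_property("filename", FBX_FILE)
task.set_editor_property("destination_path", DEST_PATH)
task.set_editor_property("automated", True)  # suppress import dialogs
task.set_editor_property("save", True)
task.set_editor_property("options", options)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```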
To create video exports, they brought a virtual camera into the scene and recorded some sequences, matched to the TOUCH opera audio, to give a sense of what a final piece could potentially look like. Naomi’s animations were also retargeted onto another rigged character to see how the movements might differ from one 3D model to another.
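Again purely as a sketch of the idea, this is roughly how a Level Sequence and a cine camera can be set up through Unreal’s editor Python API so that camera moves can be keyed over a take; the asset names and frame range are placeholders.

```python
import unreal

# Hypothetical content path; the real sequences were set up by VRI.
SEQ_DIR = "/Game/Touch/Sequences"

# Create an empty Level Sequence asset to build the shot in.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    "SEQ_TouchTest", SEQ_DIR,
    unreal.LevelSequence, unreal.LevelSequenceFactoryNew())

# Spawn a cine camera in the level and bind it to the sequence.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 170.0), unreal.Rotator(0.0, 0.0, 0.0))
camera_binding = sequence.add_possessable(camera)

# Add a transform track and section so the camera move can be keyed
# over the length of the take (the frame range here is a placeholder).
transform_track = camera_binding.add_track(unreal.MovieScene3DTransformTrack)
transform_section = transform_track.add_section()
transform_section.set_range(0, 600)
```

In practice an audio track carrying the TOUCH recording, and a camera cuts track, would also be added so the picture can be lined up against the music before rendering out the video.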
Here's an example of how the motion capture and mapping all came together at the end of the day’s session. The session offered a real sense of the powerful potential of this technology for capturing real world performances and movement for presentation in immersive spaces. Over the next few months we will continue to experiment with VRI to unlock the power of VR and immersive media for opera performance.
TOUCH performed by Naomi Louisa O’Connell and Gyula Nagy
Composed by Karen Power
Words by Ione
Directed by Jo Mangan
Video design by Luca Truffarelli
Immersive Media Creation by Aisling Phelan and Virtual Reality Ireland