ChamSys in control on The Other Side
Monday, 18 January 2021
On The Other Side with Jean Michel Jarre
France - Released on New Year’s Eve, Jean Michel Jarre’s virtual reality show, Welcome To The Other Side, is ‘a mind-expanding experience’ that garnered 900,000 views on YouTube within a week of its release and, according to Sony Music International, over 75 million views across all outlets (Facebook, VRC, Weibo, TikTok). The work mixes high-tech sights and sounds around a reimagined (and virtual) nine-centuries-old Notre Dame cathedral.
During its 55-minute run time, conceived to celebrate the arrival of 2021 in the French capital, animated geometric forms rise and fall within the historic church’s nave, keyboard instruments melt with colour, and brilliant light patterns run up stone columns to play off stained glass windows.
Moving seamlessly with Jarre’s music in Welcome To The Other Side is a multifarious lightshow created by Jvan Morandi of Placing Shadows using his ChamSys MagicQ MQ80 console as the starting point to merge show business with game engine workflows. Run along triggered timelines within the Unity game engine, the lightshow is divided into two parts: interior stage and architectural sequences, created on the ChamSys but then run directly by the Unity engine; and a series of exterior architectural sequences, involving lights and lasers as well as camera moves, played back via the lighting desk through ArtNet and software.
“We programmed the cues with a ChamSys MQ80 in my studio,” notes Morandi. “All the cue lists come from ChamSys and were translated into a set of Unity animation triggers on a timeline. When I say ‘translated’, I mean that Victor Pukhov used the visualised lighting cues to create shader animations that were then triggered by custom Unity scripts written by Antony Vitillo.”
Morandi credits his MQ80 with helping this process go smoothly. “The Copy Linked features in the ChamSys console were very useful, as was the Offset Patch,” he said. “With so many camera shots on the outside, I needed to dress and fill each shot differently depending on the situation. Also, the ability to link my desk directly to my software and transfer data between the two (fixtures patch automatically in ChamSys) was a big help in creating the shots quickly.”
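Behind phrases like “transfer data between the two” sits the ArtNet link mentioned above: the desk’s lighting levels travel as UDP packets that receiving software can decode. Purely as an illustration (this is not the production’s actual code, and the function name is hypothetical), here is a minimal Python sketch of parsing an ArtDMX packet per the published Art-Net packet layout:

```python
# Hypothetical sketch: decode an Art-Net ArtDMX packet (the message type
# that carries DMX lighting levels) -- not ChamSys' or the show's code.
# An ArtDMX packet is an 18-byte header followed by up to 512 channel values.
import struct

ARTNET_HEADER = b"Art-Net\x00"
OP_DMX = 0x5000  # ArtDMX opcode (little-endian on the wire)

def parse_artdmx(packet: bytes):
    """Return (universe, [channel levels]) from an ArtDMX packet, else None."""
    if not packet.startswith(ARTNET_HEADER) or len(packet) < 18:
        return None
    opcode = struct.unpack_from("<H", packet, 8)[0]
    if opcode != OP_DMX:
        return None  # some other Art-Net message (ArtPoll etc.)
    universe = struct.unpack_from("<H", packet, 14)[0]  # SubUni + Net
    length = struct.unpack_from(">H", packet, 16)[0]    # data length, big-endian
    return universe, list(packet[18:18 + length])

# Build a sample packet for universe 0 carrying three channels at
# full, half and zero -- the kind of levels a cue stack might output.
sample = (ARTNET_HEADER
          + struct.pack("<H", OP_DMX)  # opcode
          + struct.pack(">H", 14)      # protocol version
          + bytes([0, 0])              # sequence, physical
          + struct.pack("<H", 0)       # universe (low byte first)
          + struct.pack(">H", 3)       # data length
          + bytes([255, 128, 0]))      # DMX channel values
universe, levels = parse_artdmx(sample)
```

On the receiving side, a game-engine script would map such channel values onto shader parameters or animation triggers, which is essentially the translation step Morandi describes.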
During the show, Jarre played live from TV Studio Gabriel in Paris. He was lit only by a video projector and portrayed in vivid colours and shapes coming directly from the same video content that was UV mapped on the inside of the virtual cathedral. Lending another evocative touch to the show was the virtual rendition of Notre Dame’s interior. The 3D model and the game engine programming were optimised by Lapo Germasi and Victor Pukhov of Manifattura Italiana Design. “Once we received the interior of the cathedral, we worked with our studio software and the game engine to find the right looks,” explained Morandi.
Collaboration was critical to making this VR creation come to fruition. “This project involved a great many very creative people coming together from diverse backgrounds, starting with Jean Michel Jarre, whose vision and hard work made it all possible,” said Morandi. “Credit should also go to Louis Caracciolo from VRroom, our French VR producer; and Antony Vitillo of NTW (Italian developers who looked after all the scripting and game engine functionality). Jonathan Klahr did an amazing job on the 2D video content mapped onto the interior walls. Stephan and Jeroen from LaserImage of Amsterdam programmed the initial laser sequences. Georgy Molotsdov, Maud Clavier and David Montagne (global TV broadcast) did a great job filming the show, all in VR.”
Exemplifying the scope of the project, one camera director was in Moscow (Georgy Molotsdov), while another was in Paris (Maud Clavier). Each of them controlled up to eight remote VR cameras and drones. Filming of this live VR gig took place entirely in VR within the VRChat platform.
