Dream was inspired by A Midsummer Night’s Dream (photo: Stuart Martin / RSC)
UK - Dream is a multi-layered mix of movement, music, visuals, cutting-edge technology and narrative magic: an immersive digital performance fusing the drama of Shakespeare with the dynamic worlds of gaming and theatre. It was produced by the UK's Royal Shakespeare Company in collaboration with Manchester International Festival, Marshmallow Laser Feast and the Philharmonia Orchestra (read a full production report in the upcoming issue of LSi, out early May).
Staged physically in the Studio at Portsmouth Guildhall and inspired by the classic A Midsummer Night's Dream, the show featured six real actors tracked by Vicon motion capture cameras, their avatars and effects appearing onscreen. The action centres on the antics of cocky and capricious mischief-maker Puck, who runs amok in the virtual midsummer forest, hell-raising and spoofing four other sprites on a disruptive and chaotic journey.
Matt Peel was responsible for lighting design and show control. He harnessed the power of his grandMA3 system - specifically the OSC (Open Sound Control) and DMX remote-triggering capabilities of the new grandMA3 software.
The show was broadcast live over 10 evenings and enjoyed by thousands worldwide, who logged in either with a paid interactive ticket - offering the chance to shoot fireflies into the story to help illuminate Puck's pathway through the forest - or to watch the performance for free.
Dream was originally intended as an in-person performance during 2020 but, due to the pandemic, was adapted and reworked as a live performance that could be enjoyed fully remotely.
Matt explains that, as part of the project's R&D phase, he worked closely with the RSC's Daniel Orchard, an Unreal Engine developer, to integrate established event technologies that could communicate with both the virtual and real-world show control systems.
An MVR (My Virtual Rig) importer developed for Unreal enabled the Vectorworks pipeline for the real-world lighting to be read both by Unreal and natively by grandMA3. OSC was integrated into Arduino-powered proximity sensors to report the status of the physical world. And a PosiStageNet (PSN) plugin for Unreal - PSN is a protocol for live 3D positional data - enabled grandMA3 and Unreal (which already has built-in DMX and OSC support) to communicate bi-directionally.
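To give a flavour of what travels over the wire in an integration like this, the sketch below hand-encodes an OSC 1.0 message in pure Python - padded address, type-tag string, big-endian arguments. The address pattern (`/sensor/1/proximity`) and the target IP/port are illustrative assumptions, not details from the production.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message: padded address, type-tag string, then arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("booleans need explicit T/F tags; not handled here")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian integer
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, str):
            tags += "s"
            payload += osc_pad(a.encode("ascii"))
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return osc_pad(address.encode("ascii")) + osc_pad(tags.encode("ascii")) + payload

# A proximity sensor reporting "occupied" might send (address is hypothetical):
msg = osc_message("/sensor/1/proximity", 1)
# ...and deliver it over UDP, e.g.:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("10.0.0.10", 8000))
```

In practice an Arduino would build these bytes with a library such as CNMAT's OSC library rather than by hand, but the wire format is the same.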
By building custom grandMA3 fixture types, certain aspects of the game world could be controlled via DMX, such as the height and brightness of an object like the sun, or the colour of an avatar. Using grandMA3, all these elements could be tweaked rapidly in real time on the console.
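The idea of a "fixture type" for a game object is just a channel map: continuous game parameters are quantised into DMX channel values that a listener in the engine decodes. The sketch below shows a hypothetical three-channel "sun" fixture (16-bit elevation plus an 8-bit dimmer); the channel layout and parameter ranges are assumptions for illustration, not the production's actual fixture profile.

```python
def to_8bit(value: float, lo: float, hi: float) -> int:
    """Scale a parameter into a single 8-bit DMX channel (0-255)."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 255)

def to_16bit(value: float, lo: float, hi: float) -> tuple[int, int]:
    """Scale a parameter into a coarse/fine 16-bit DMX channel pair."""
    value = max(lo, min(hi, value))
    raw = round((value - lo) / (hi - lo) * 65535)
    return raw >> 8, raw & 0xFF

def sun_fixture(elevation_deg: float, brightness: float) -> list[int]:
    """Hypothetical 3-channel 'sun' fixture: elevation (16-bit) + dimmer (8-bit)."""
    coarse, fine = to_16bit(elevation_deg, 0.0, 90.0)
    return [coarse, fine, to_8bit(brightness, 0.0, 1.0)]

# Full sun at 45 degrees elevation:
print(sun_fixture(45.0, 1.0))  # -> [128, 0, 255]
```

Because the console already knows how to fade, crossfade and cue DMX channels, any game parameter exposed this way inherits the console's whole toolset for free, which is exactly why live tweaking was so much faster than game-engine keyframing.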
“Using the grandMA3 in this way meant we could work really fast in this context to make adjustments to these game effects, rather than using game engine keyframing which is a lot more time consuming jumping Unreal in and out of ‘editor’ mode,” says Matt.
Matt explains that a small rig of traditional theatre lighting - eight moving lights - in the real studio motion capture volume (the capture space) assisted in directing the actors to respond to interactions from the remote audience.
In addition to orchestral music by composer Jesper Nordin and Philharmonia Orchestra principal conductor and artistic advisor Esa-Pekka Salonen, at strategic points the actors' movement was fed into Gestrument software (a Jesper Nordin project), allowing them to 'play' digital instruments and interact together through their motion.
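To illustrate the general idea of motion-driven instruments, here is a minimal sketch in the spirit of (and not taken from) Gestrument: a hand-height value picks a pitch from a pentatonic scale and movement speed sets note velocity. The scale, ranges and mapping are all assumptions for illustration.

```python
# MIDI note numbers for a C major pentatonic scale across two octaves.
PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76]

def motion_to_note(hand_height_m: float, speed_m_s: float) -> tuple[int, int]:
    """Map motion-capture data to a hypothetical (pitch, velocity) pair."""
    # Normalise hand height (assumed 0-2 m range) and index into the scale.
    h = max(0.0, min(2.0, hand_height_m)) / 2.0
    pitch = PENTATONIC[min(int(h * len(PENTATONIC)), len(PENTATONIC) - 1)]
    # Normalise speed (assumed 0-3 m/s range) into MIDI velocity 1-127.
    v = max(0.0, min(3.0, speed_m_s)) / 3.0
    velocity = round(1 + v * 126)
    return pitch, velocity

# A hand raised high and moving fast plays a high, loud note:
print(motion_to_note(2.0, 3.0))  # -> (76, 127)
```

Constraining pitches to a scale is a common design choice in gestural instruments: it keeps free-form movement sounding musical without the performer needing instrumental training.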
The overall show was cued and run in traditional theatre style by a DSM (deputy stage manager) - a synergetic mix of innovative technologies and well-respected techniques that pushed several boundaries.
Dream was directed by Robin McNicholas of Marshmallow Laser Feast, Sarah Perry was the movement director, and the project lead and executive producer was Sarah Ellis, director of digital development at the RSC. Ambersphere Solutions is the exclusive distributor of MA Lighting in the UK.
