Weta Digital and 2n Design Dazzle PHAROS Festival with Interactive Graphics
Attendees at the PHAROS festival, a three-day event held in November in Auckland, New Zealand, that combined music, visual spectacle and immersive effects, experienced three shows by singer Childish Gambino, one on each night. The concerts were held inside a 160-foot-tall dome holding 3,000 people, with visuals projected in a 360-degree view to support the stage design and Childish Gambino (aka Donald Glover) performing at the centre.
When audiences looked up, they saw a ‘sky’ full of characters moving to the music and colourful scenery lighting up the dome. For this outsized project, studio 2n Design from New York and Weta Digital together designed and integrated interactive, animated graphics into the show, styling a different visual world for each song Donald performed.
The size of the dome meant that a huge number of pixels would be needed on the screen. “We knew we could make a high-quality result, particularly if we went with the Unreal Engine to develop our system architecture,” said Kevin Romond, Special Projects Supervisor at Weta Digital. “It has so many tools related to image quality that we knew we could really make use of for this project.”
VR Review
In the graphic design stages, to make sure that the visuals were effective and engaging from all points of view and angles inside the dome, the two teams first developed the content within a VR implementation. This allowed them to review and iterate the designs before setting up the system.
“Creating visual narratives that were interactive and could respond to Donald’s performance – as well as audience mood and energy – was a great opportunity,” said Keith Miller, VFX supervisor at Weta Digital. “We were able to art-direct movements inside the dome that made every vantage point feel special, and we were able to preview that environment by working in a real-time VR workspace.”
“VR helped us understand what was happening inside the 360-degree design,” said Alejandro Crawford, creative director for PHAROS at 2n Design. “We could tell if a tree was too close to the camera, or if a creature needed to move a different way. VR helped us compose the shots and choreography the way we wanted.”
Austion Woolfolk at 2n Design sets out the display system architecture for the show.
Real-Time Synchronisation
Once the visuals were ready, the team had to synchronise the graphics to the performance and make sure the entire system rendered in sync. Displaying such large imagery across the dome’s surface required multiple projectors, and the display system was made up of five machines, each responsible for a region of the dome.
The team found they could accurately render the required kind of output in nDisplay, the Unreal Engine utility for creating a seamless visual of a single scene from multiple projections in real time. With nDisplay, 2n and Weta could orchestrate the five machines into one continuous circular render during the live production using NVIDIA Quadro P6000 graphics.
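As a rough illustration of how such a cluster can be carved up – the node naming and the even 72-degree split below are our own assumptions, not the production’s nDisplay configuration – each of the five machines can own one slice of the dome’s 360-degree yaw range:

    # Hypothetical sketch: dividing a 360-degree dome among five render nodes.
    # The even 72-degree split and node naming are illustrative assumptions,
    # not the production nDisplay configuration.
    NUM_NODES = 5
    YAW_PER_NODE = 360.0 / NUM_NODES  # 72 degrees of yaw per machine

    def node_yaw_range(node_index):
        """Return the yaw range (in degrees) a given render node covers."""
        start = node_index * YAW_PER_NODE
        return start, start + YAW_PER_NODE

    for i in range(NUM_NODES):
        lo, hi = node_yaw_range(i)
        print(f"node_{i}: yaw {lo:.0f} to {hi:.0f} degrees")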
The team calculated that they would need a 5.4K by 5.4K image, rendered by the five frame-locked machines on the Quadro P6000 graphics, warped into a fisheye projection that kept portions of the image in proportion to one another, and then split among the 12 projectors. “Using nDisplay to distribute the real-time rendering, and keep the resolution and the frame rate where it needed to be for this project, was critical to the success of the project,” said Jeremy Thompson, Art Director at 2n.
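The fisheye step maps every viewing direction inside the dome onto a square ‘dome master’ image so that angular proportions are preserved. Below is a minimal sketch of the standard azimuthal-equidistant mapping commonly used for dome masters – an assumption about the general technique, not Weta’s or 2n’s actual code:

    import math

    # Sketch of an azimuthal-equidistant ("dome master") fisheye mapping:
    # a 3D view direction inside a hemispherical dome becomes a 2D point in
    # a square image, keeping angles in proportion. Illustrative only.

    def direction_to_fisheye(x, y, z, image_size=5400):
        """Map a unit direction (z = up toward the dome apex) to pixel coords."""
        zenith = math.acos(max(-1.0, min(1.0, z)))   # angle from the apex
        azimuth = math.atan2(y, x)                   # angle around the dome
        r = zenith / (math.pi / 2)                   # 0 at apex, 1 at the rim
        u = 0.5 * (1.0 + r * math.cos(azimuth)) * image_size
        v = 0.5 * (1.0 + r * math.sin(azimuth)) * image_size
        return u, v

    # Straight up lands at the image centre; the horizon lands on the rim.
    print(direction_to_fisheye(0.0, 0.0, 1.0))  # (2700.0, 2700.0)
    print(direction_to_fisheye(1.0, 0.0, 0.0))  # (5400.0, 2700.0)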
Synchronisation of an application that runs on multiple displays involves frame lock synchronisation and swap synchronisation. Frame lock uses hardware to synchronise the frames on each display and to redraw to multiple displays at the same time. When an application is displayed across multiple monitors – here, the multiple regions of the dome – frame-locked systems help maintain image continuity to create a single virtual canvas. When several systems are connected, a sync signal is fed from a master system to the other systems in the network, keeping all the displays synchronised with each other.
Swap sync refers to synchronising buffer swaps across multiple application windows. Using swap sync, applications running on multiple systems can synchronise their buffer swaps between all the systems, keeping every display frame-accurate.
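Conceptually, the combination behaves like a per-frame barrier: no machine presents frame N until every machine has finished rendering it. In production this is enforced in hardware (NVIDIA’s Quadro Sync), but a small threading sketch makes the contract concrete – illustrative only, not the actual mechanism:

    import threading

    # Conceptual illustration of swap sync as a per-frame barrier. The real
    # system does this in hardware (Quadro Sync); this sketch only shows the
    # contract: no node swaps buffers until all nodes have rendered the frame.

    NUM_NODES = 5
    swap_barrier = threading.Barrier(NUM_NODES)

    def render_node(node_id, frames=3):
        for frame in range(frames):
            # ... render this node's region of the dome for `frame` ...
            swap_barrier.wait()   # block until all five nodes are ready
            print(f"node {node_id} swaps buffers for frame {frame}")

    threads = [threading.Thread(target=render_node, args=(i,))
               for i in range(NUM_NODES)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()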
Austion Woolfolk and Alejandro Crawford with the NVIDIA Quadro P6000 GPUs.
Transitions – One World to Another
“We wanted invisible transitions from one world to another, as opposed to fading in and out. Waiting for levels to load was not an option,” said Alejandro. “Our strategy was to load everything into memory, and the Quadro GPUs had enough video texture memory for this to work.” The team built a custom engine to receive content from Unreal; using nDisplay, the individual feeds were stitched in real time, producing beautiful projections at over 5K resolution.
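Back-of-the-envelope arithmetic suggests why the everything-in-memory strategy was feasible – these are our own illustrative figures, not numbers from the production, though the Quadro P6000 does ship with 24 GB of VRAM:

    # Back-of-the-envelope VRAM arithmetic (our own estimate, not production
    # figures): one uncompressed 5.4K x 5.4K RGBA8 frame vs. a Quadro P6000.

    width = height = 5400
    bytes_per_pixel = 4                 # RGBA, 8 bits per channel
    frame_bytes = width * height * bytes_per_pixel
    p6000_vram = 24 * 1024**3           # the P6000 ships with 24 GB

    print(f"one frame: {frame_bytes / 2**20:.1f} MiB")       # ~111 MiB
    print(f"frames that fit: {p6000_vram // frame_bytes}")   # ~220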
Audio-reactive parameters were designed and mapped to the system so the visuals were influenced by Childish Gambino’s performance as well. Austion Woolfolk, Video Designer at 2n, operated the visuals in real time during the concert, interacting with Unreal Engine via a MIDI controller to make the graphics’ timing gel perfectly with the music and other VFX. “Donald Glover really wanted to bring the audience into this world and create an experience that everyone has together, every single piece ebbing and flowing, as if we're going on a journey together,” he said.
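A minimal sketch of that kind of MIDI-to-parameter mapping, written with the Python mido library – the controller numbers and parameter names here are hypothetical, and the show’s actual mapping into Unreal Engine isn’t documented:

    import mido  # third-party MIDI library: pip install mido python-rtmidi

    # Hypothetical sketch of driving visual parameters from a MIDI controller.
    # The CC numbers and parameter names are made up for illustration; the
    # production mapping into Unreal Engine was not published.

    CC_TO_PARAM = {
        1: "world_brightness",   # mod wheel -> scene brightness
        2: "creature_speed",     # CC 2      -> animation speed
    }

    def handle_message(msg, params):
        if msg.type == "control_change" and msg.control in CC_TO_PARAM:
            name = CC_TO_PARAM[msg.control]
            params[name] = msg.value / 127.0   # normalise 0..127 to 0..1
            print(f"{name} = {params[name]:.2f}")

    params = {}
    with mido.open_input() as port:            # first available MIDI input
        for msg in port:
            handle_message(msg, params)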
The result was an immersive music event with spectacular graphics projected onto the dome’s interior in real time, all reacting to the live performance. Combined with custom lighting, smoke and a laser show, the visual experience earned the team a VES Award for Outstanding Visual Effects in a Special Venue Project in February 2019. www.unrealengine.com