Animation and VFX studio Animal Logic recently released a trailer for ‘Unhinged’, a short film their team created in Unreal Engine with assets from ALab, the company’s open-source Universal Scene Description (USD) project.

Animal Logic’s ALab is a complete production scene featuring more than 300 detailed, photoreal USD assets, including two rigged and animated characters, Goggles and Hinge, all designed by in-house talent. The studio developed the scene for the education sector and the wider creative community, giving anyone interested in how USD works in production a full set of assets and a scene to explore and experiment with.

Available as a free download for creators to use in their own workflows, ‘Phase 2’ of ALab was released in August 2022 and is now part of the Academy Software Foundation’s Digital Production Example Library. The data set is supplied as three separate downloads: the full production scene, high-quality textures, and baked procedural fur and fabric for the animated characters.
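
For anyone downloading the data set, the scene opens like any other USD asset. The short sketch below uses the USD Python API (pxr) to open the production scene, count its mesh prims and export a flattened copy; the entry-point file name and output path are assumptions based on a typical ALab download rather than a documented layout.

```python
# A minimal sketch of inspecting the ALab scene with the USD Python API (pxr).
# The entry-point file name "entry.usda" and the output path are assumptions;
# check the layout of the downloaded ALab package.
from pxr import Usd, UsdGeom

# Open the top-level stage of the downloaded production scene.
stage = Usd.Stage.Open("alab/entry.usda")

# Walk the composed stage and count the geometry it brings in.
mesh_count = 0
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        mesh_count += 1
print(f"Composed stage contains {mesh_count} mesh prims")

# Export a flattened copy, e.g. for import into a game engine pipeline.
stage.Export("alab_flattened.usdc")
```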

Unhinging USD Assets for Real-time Production

Showing off Animal Logic’s unique style and playful storytelling approach, ‘Unhinged’ centres on the adventures of manic ferret Dr. Goggles McPherretson and neurotic robot Hinge in an abandoned workshop environment. To produce the story, the team built a cloud workflow on AWS that let them run a Windows-based Unreal Engine pipeline entirely on cloud-based workstations, without impacting their local Linux-based production pipeline.

“‘Unhinged’ is the culmination of research we’ve been doing with Unreal Engine, and we wanted to experiment with using it in a real production to see how it would impact our creative process,” said Darin Grant, CTO at Animal Logic. “But, it made far more sense to launch workstations in the cloud than displace physical production workstations, especially with the ongoing supply chain issues.”

The primary project goals were to understand how USD production assets would hold up in a real-time environment while exploring early storytelling and character performance potential. The team also wanted to experiment with using Unreal Engine for virtual production on an animated project, which meant using effects, dynamics and compositing tools in-engine while operating in the cloud.

Separate Network

Building a new cloud-based infrastructure from the ground up, the team created a separate network to support the Windows-based Unreal workflow using Amazon Nimble Studio, a set of services for assembling a studio in the cloud. With Nimble Studio, teams bring together virtual workstations, storage and render farms to match their particular use case, using the StudioBuilder wizard to help choose the right combination of EC2 G4 instances, shared file storage (Amazon FSx) and Amazon Machine Images (AMIs).
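
As a rough illustration of what a studio built this way exposes programmatically, the sketch below uses boto3’s Nimble Studio client to list studios and their registered streaming images (the AMIs used for virtual workstations). It is a hedged sketch of one possible inventory script, not Animal Logic’s tooling, and the region is a placeholder.

```python
# A hedged sketch of inventorying a Nimble Studio setup with boto3.
# The region is a placeholder, not Animal Logic's configuration.
import boto3

nimble = boto3.client("nimble", region_name="us-west-2")

# Studios created through StudioBuilder show up here.
for studio in nimble.list_studios()["studios"]:
    print(studio.get("displayName"), studio["studioId"])

    # Streaming images are the AMIs registered with the studio
    # for launching virtual workstations.
    images = nimble.list_streaming_images(studioId=studio["studioId"])
    for image in images.get("streamingImages", []):
        print("  image:", image.get("name"), image.get("ec2ImageId"))
```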

To accommodate different use cases, including previsualisation, motion and performance capture, and real-time review, Animal Logic built three different AMIs, which are descriptions and sets of instructions used to launch an instance, or cloud server resource, in AWS. These AMIs had varying GPU power, and artists chose the appropriate virtual machine based on the assigned task for the week and streamed it to their end device using NICE DCV.
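
One way to picture that task-based selection is the sketch below, which maps each use case to a hypothetical AMI and G4 instance size and launches a workstation with plain EC2 calls. Animal Logic’s actual setup ran through Nimble Studio rather than raw EC2, and every ID, size and network value here is a placeholder for illustration.

```python
# A rough sketch of mapping weekly tasks to workstation sizes, expressed with
# plain EC2 calls rather than Nimble Studio's own session workflow. All AMI
# IDs, instance sizes, subnet and key values are placeholders.
import boto3

# Three hypothetical AMIs with increasing GPU capacity (G4 family).
TASK_PROFILES = {
    "previs": {"ami": "ami-placeholder-previs", "type": "g4dn.xlarge"},
    "mocap": {"ami": "ami-placeholder-mocap", "type": "g4dn.4xlarge"},
    "realtime_review": {"ami": "ami-placeholder-review", "type": "g4dn.12xlarge"},
}

def launch_workstation(task: str, subnet_id: str, key_name: str) -> str:
    """Launch a virtual workstation sized for the week's assigned task."""
    profile = TASK_PROFILES[task]
    ec2 = boto3.client("ec2", region_name="us-west-2")
    response = ec2.run_instances(
        ImageId=profile["ami"],
        InstanceType=profile["type"],
        MinCount=1,
        MaxCount=1,
        SubnetId=subnet_id,
        KeyName=key_name,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "task", "Value": task}],
        }],
    )
    return response["Instances"][0]["InstanceId"]
```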

Display Protocol

NICE DCV is a remote display protocol that delivers remote desktops and application streaming from cloud or data centre hardware to local devices, securely and over varying network conditions. It is what makes EC2 instances a viable alternative for running graphics-intensive applications remotely: because the application’s user interface is streamed to a simpler client machine, users can avoid relying on expensive dedicated workstations.

“Using virtual workstations with NICE DCV, we’ve had zero detectable lag, even when we used it on an airplane, as I discovered, which was an amazing feature, especially on a real-time project,” said Animal Logic VFX Supervisor Stephanie Pocklington. “From there, artists would load assets through Perforce [version control software for developers] and then begin work in Unreal Engine.

"The whole process of connecting to a pre-launched machine and getting straight to work took 15 minutes on average. As a supervisor, it was incredible to be able to distribute a variety of powerful custom virtual machines to artists, loaded with our particular version of software, almost instantaneously. Normally, you may have to wait several months to get comparable physical machines in your studio.”   aws.amazon.com