Digital Domain Welcomes Gabrielle Gourrier as Executive Vice President of VFX
Digital Domain appointed Gabrielle Gourrier as Executive Vice President of VFX Business. In this role, Gabby will focus on expanding client partnerships and establishing strategic initiatives.
Chaos V-Ray 7 for 3ds Max brings Gaussian Splat support for fast photoreal environments, new ways to create interactive virtual tours and more realistic, controllable lighting to 3D rendering.
VFX Supervisor Morgan McDermott at Impossible Objects talks about making their immersive film for UFC’s debut at the Sphere, combining heroes and symbols of Mexican independence with UFC legends.
Chaos puts Project Arena’s Virtual Production tools to the test in a new short film, achieving accurate ICVFX with real-time raytracing and compositing. Christopher Nichols shares insights.
Moving Picture Company (MPC) has appointed Lucinda Keeler as Head of Production for its London studio, bringing over 20 years of experience and leadership in the VFX industry.
REALTIME studio has launched a Virtual Production division, following its grant from Media City Immersive Technologies Innovation Hub to develop a proprietary Virtual Production tool.
ZibraVDB plugin for Virtual Production and CG studios delivers high compression rates and fast render times, making it possible to work with very large volumetric effects in real-time.
Maxon One 2025 updates Cinema 4D, Redshift, Red Giant and Cineware, and releases ZBrush for iPad, putting ZBrush sculpting tools onto a mobile device with a new UI and touch controls.
Das Element asset library software version 2.1 has new video playback controls, hierarchy tree customisation for libraries, faster set-up processes and simpler element migration.
Autodesk returned to SIGGRAPH 2024 to show software updates that include generative AI and cloud workflows for 3D animation in Maya, production scheduling and clip retiming in Flame.
Shutterstock launched a cloud-based generative 3D API, built on NVIDIA Edify AI architecture, trained on licensed Shutterstock content, as a fast way to produce realistic 3D models with AI.
Freefolk has promoted Rob Sheridan to VFX Supervisor in their Film and Episodic division and Paul Wight is now the company’s first Chief Operating Officer.
DNEG is increasing its Canada-based VFX and animation operations and opening a new studio in the Greater Toronto Area. The company’s fourth North American facility, joining studios in Montreal, Vancouver and LA, will initially employ up to 200 people, primarily in new technology positions as well as VFX and animation roles. The new employees will initially work remotely, with the search for a physical studio location in the Greater Toronto Area now underway.
The opening of the Toronto studio is part of a wider expansion across Canada. The company also plans to expand its existing Vancouver and Montreal offices with up to 300 new roles, including up to 100 new positions for its DNEG Animation team as it opens its doors in Vancouver.
People interested in joining the teams in Canada can find these new opportunities and job postings on DNEG’s careers page.
Commenting on the expansion, CEO Namit Malhotra said, “We are further investing in Canada by creating up to 200 new jobs in the Greater Toronto Area with the opening of our fourth North American studio. We are also continuing to build out our studios in Vancouver and Montreal to support our upcoming slate of VFX projects, and extending our successful DNEG Animation team to Vancouver as they move into production on five new feature animation projects. Growth in our Canadian talent and capabilities will help us strategically align with the demands of the entertainment industry and seize our new growth initiatives and content creation opportunities.”
DNEG’s Global CTO Paul Salvini said, “I’m excited that DNEG is bringing these new opportunities to the visual effects and technology communities of the Greater Toronto Area. As a resident of Kitchener / Waterloo myself, I’m well aware of the strength of the technology talent in this area. This is a great chance for technologists working in AI, machine learning, UX and across a broad spectrum of technology areas to refocus their talents on helping to create incredible imagery for some of the world’s biggest feature films and episodic series. We’re not fixed on candidates having previous film or media industry experience – we’re looking for passionate and curious technologists who are excited at the prospect of a new challenge.” www.dneg.com
Blender is becoming a standard 3D software application and, according to AMD, needs to work within larger workflows alongside other applications. Pixar’s Universal Scene Description (USD) has made data exchange between 3D applications easier for artists, opening a robust, open way to exchange and assemble data from multiple applications. AMD says that Blender users should be able to enjoy the same ease of use and dedicated experience with USD.
Blender USD Hydra
AMD has launched a project enabling USD data assembly and rendering inside of Blender. Brian Savery, Professional Graphics Software Development Manager for AMD, said, “Blender includes a basic USD exporter, and soon will include import tools. However, there is no method of rendering existing USD data within Blender or referencing a USD file into your Blender scene. Other tools that support USD, such as SideFX Houdini or Autodesk Maya, also allow assembly and manipulation of USD data.
“Furthermore, while Blender users create intricate shader networks for its Cycles path-tracing render engine, they need a way to share shading networks with other applications. USD includes a rendering system called Hydra that allows multiple renderers with one common interface. AMD adapted this system to work directly inside Blender. By adapting Hydra as a render add-on to Blender, any renderer that supports Hydra can be connected to Blender by plugging into the Hydra add-on.”
Also, the Hydra system handles scene updates and rendering very quickly, which leads to better renderer performance than common Blender rendering add-ons. Currently this add-on includes the Hydra OpenGL renderer and the AMD Radeon ProRender plug-in for Hydra, though other Hydra render delegates should work equally well.
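As a rough sketch of what “multiple renderers with one common interface” means in practice, Pixar’s USD Python bindings can enumerate the Hydra render delegates available in a build and switch between them. This is illustrative of the Hydra mechanism only, not AMD’s add-on code, and assumes a USD build with Python bindings and a live OpenGL context.

```python
# Minimal sketch of Hydra's renderer-agnostic interface using Pixar's
# USD Python bindings (illustrative; not AMD's Blender add-on code).
from pxr import UsdImagingGL

# List the Hydra render delegates available in this USD build, e.g. the
# OpenGL "Storm" renderer, or Radeon ProRender if its delegate is installed.
for plugin_id in UsdImagingGL.Engine.GetRendererPlugins():
    print(plugin_id, UsdImagingGL.Engine.GetRendererDisplayName(plugin_id))

# Create a Hydra-backed render engine (requires a valid OpenGL context)
# and select a delegate; the same scene can be rendered by any of them.
engine = UsdImagingGL.Engine()
engine.SetRendererPlugin(UsdImagingGL.Engine.GetRendererPlugins()[0])
```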
USD Scene Composition
Another important aspect of USD support is enabling USD scene composition in Blender. AMD achieves this with a custom node graph, allowing users to pull in external data to mix with Blender data and filter, manipulate and export USD data. This allows tools for pruning data, referencing data without loading it into Blender’s memory, interaction between multiple artists, and exporting composited scenes for offline rendering.
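For readers unfamiliar with these composition features, here is a minimal sketch of the underlying USD mechanics using Pixar’s Python API, separate from AMD’s node graph; the file and prim paths are hypothetical.

```python
# Minimal sketch of USD composition: referencing external data and
# deferring heavy payloads (file and prim paths are hypothetical).
from pxr import Usd

# Assemble a stage that references an external asset without copying it.
stage = Usd.Stage.CreateNew('assembly.usda')
chair = stage.DefinePrim('/Assembly/Chair')
chair.GetReferences().AddReference('chair_asset.usda')
stage.GetRootLayer().Save()

# Re-open the stage without loading payloads: the hierarchy is browsable,
# but heavy geometry stays on disk until explicitly loaded.
unloaded = Usd.Stage.Open('assembly.usda', Usd.Stage.LoadNone)
```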
Similar to USD geometry, AMD handles materials using the open source MaterialX standard. Created by Industrial Light & Magic for sharing material graphs across renderers, it is quickly gaining acceptance as a standard material format. This makes it possible to add material node graphs from Adobe Substance 3D Painter and various Autodesk applications, as well as export them.
MaterialX is a growing standard with increasing adoption across applications. To help encourage adoption, AMD plans to launch a free Material Library for sharing MaterialX materials on AMD’s GPUOpen.com. Users will be able to download materials from it and import them directly into the Blender Hydra plug-in.
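As an illustration of how portable these material graphs are, the sketch below authors a simple material with the MaterialX Python bindings and writes it to a .mtlx file that other MaterialX-aware tools can import; the node, value and file names are hypothetical.

```python
# Minimal sketch of authoring a shareable MaterialX material with the
# MaterialX Python bindings (node, value and file names are illustrative).
import MaterialX as mx

doc = mx.createDocument()

# Create a standard_surface shader node and set a couple of inputs.
shader = doc.addNode('standard_surface', 'SR_brass', 'surfaceshader')
shader.setInputValue('base_color', mx.Color3(0.8, 0.6, 0.25))
shader.setInputValue('metalness', 1.0)

# Bind the shader to a material node and write the graph to a .mtlx
# file that MaterialX-aware applications can import.
material = doc.addMaterialNode('M_brass', shader)
mx.writeToXmlFile(doc, 'brass.mtlx')
```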
The video linked here is presented by the author of the AMD USD plug-in for Blender as an overview of the main features. www.amd.com
Unreal Engine 4.27 is now available with updates supporting filmmakers, broadcasters, game developers, architectural visualisation artists, and automotive and product designers.
In-camera VFX
The use of in-camera VFX is now more efficient, with results of a quality suitable for wider applications such as broadcast and live events.
Designing set-ups in nDisplay, Unreal’s tool for LED volumes and rendering to multiple displays, is simpler to manage due to a new 3D Config Editor. All nDisplay-related features and settings are placed in a single nDisplay Root Actor to make them easier to access. Setting up projects with multiple cameras is also easier.
nDisplay now supports OpenColorIO, improving the accuracy of colour calibration so that content created in Unreal Engine matches what the physical camera captures from the LED volume.
For efficient scaling in nDisplay, multiple GPUs are supported. This also makes it possible to make the most of resolution on wide shots by dedicating a GPU for in-camera pixels, and to shoot with multiple cameras, each with its own uniquely tracked field-of-view.
A new drag-and-drop remote control web UI builder is now available to help build complex web widgets without writing code. This makes it possible for users without Unreal Engine experience to control their results from the engine on a tablet or laptop.
Camera Control
Also, the Virtual Camera system built for Unreal Engine 4.26 now includes Multi-User Editing, a redesigned user experience and an extensible core architecture – that is, it can be extended with new functionality without modifying the original codebase. A new iOS app, Live Link Vcam, is available for virtual camera control – users can drive a Cine Camera inside Unreal Engine using a tablet or other device.
A new Level Snapshots function will save the state of a given scene and later restore any or all of its elements, for pickup shots or as part of an iteration phase. Users also have more flexibility when producing correct motion blur for travelling shots that accounts for the look a physical camera would have with a moving background.
Recently, Epic Games and filmmakers’ collective Bullitt assembled a team to test all of these in-camera VFX tools by making a short test piece following a production workflow.
USD, Alembic and Workflow Connections
With this release, it’s now possible to export a bigger variety of elements to USD, including Levels, Sublevels, Landscape, Foliage and animation sequences, and to import materials as MDL nodes. You can now also edit USD attributes from the USD Stage Editor, including through Multi-User Editing, and bind hair and fur Grooms to GeometryCache data imported from Alembic.
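As a point of reference, the same kind of attribute edit can be made outside Unreal with Pixar’s USD Python API – a minimal sketch, assuming a hypothetical stage file and prim path:

```python
# Sketch of a USD attribute edit made directly with Pixar's Python API
# (the stage file and prim path here are hypothetical).
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open('level.usda')
prim = stage.GetPrimAtPath('/Level/Props/Crate')

# Set the prim's visibility attribute and save the edit back to the layer.
UsdGeom.Imageable(prim).GetVisibilityAttr().Set(UsdGeom.Tokens.invisible)
stage.GetRootLayer().Save()
```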
Datasmith is Unreal’s set of tools for importing data from various sources. In 4.27, Datasmith Runtime allows more control over how the data is imported, including access to the scene hierarchy and the ability to import .udatasmith data into a packaged application built on Unreal Engine such as the Twinmotion real-time architectural visualisation tool, or a custom real-time design review tool.
A new Archicad Exporter plugin with Direct Link functionality is available, and Direct Link has been added to the existing Rhino and SketchUp Pro plugins. Datasmith Direct Link maintains a live connection between a source DCC tool and an Unreal Engine-based application for simpler iteration. You can also aggregate data from several sources, such as Revit and Rhino, while maintaining links with each DCC tool simultaneously.
GPU Light Baking
Unreal Engine’s GPU Lightmass uses the GPU instead of the CPU to progressively render pre-computed lightmaps, using the ray tracing capabilities of DirectX 12 (DX12) and Microsoft’s DXR framework. It was developed to reduce the time needed to generate lighting data for scenes that require global illumination, soft shadows and other complex lighting effects that are expensive to render in real time.
Also, since the results can be seen progressively, the workflow becomes interactive. Users can stop, make changes and start over without waiting for the final bake. For in-camera VFX and other work, GPU Lightmass means that virtual set lighting can be modified much faster than before, for efficiency.
VR, AR and Mixed Reality
Production-ready support for the OpenXR framework has been added to make creating extended reality content – VR, AR and mixed reality – in Unreal Engine easier. OpenXR simplifies and unifies AR/VR software development, so that applications can be used on a wider variety of hardware platforms without having to port or re-write code, and compliant devices can access more applications.
The Unreal OpenXR plugin allows users to target multiple XR devices with the same API. It now supports Stereo Layers, Splash Screens and querying Playspace bounds to determine which coordinate space camera animation should play relative to. Extension plugins from the Marketplace are available to add functionality to OpenXR without waiting for new game engine releases. The VR and AR templates have a new design with more features built in and faster project set-up.
Containers in the Cloud
Epic Games has continued to develop Pixel Streaming, which is now ready for production and has an upgraded version of WebRTC. It enables Unreal Engine, and applications built on it, to run on a cloud virtual machine while end users anywhere interact with them as normal through a regular web browser on any device. 4.27 also has Linux support and the ability to run Pixel Streaming from a container environment.
This new support for containers on Windows and Linux means that Unreal Engine can act as a self-contained, foundational technical layer. Containers are packages of software that encompass all of the necessary elements to run in any environment, including the cloud.
Container support includes new cloud-based development workflows and deployment strategies, such as AI/ML engine training, batch processing and rendering, and microservices. Continuous integration/continuous delivery (CI/CD) can be used to build, test, deploy and run applications in a continuous process. Unreal Engine containers can support production pipelines, develop cloud applications, deploy enterprise systems at scale and other development work. www.unrealengine.com
Autodesk’s Bifrost updates include virtual sliders for feedback port changes, unknown nodes for fixing broken graphs, expressive simulation graphs, and terminals for renderable geometry.
Animal Logic’s USD ALab is a fully realised USD scene, intended to encourage collaboration and further exploration of Pixar’s Universal Scene Description (USD) among the wider community. Animal Logic Group has now released USD ALab as open source software.
As an early adopter of USD, Animal Logic began transitioning their Sydney and Vancouver studios to an end-to-end USD based pipeline, starting during production on ‘Peter Rabbit’ in 2017 and completing with ‘Peter Rabbit 2’ in March 2020.
Seen earlier in their 2017 open source project AL_USDMaya, Animal Logic continues to promote broader USD adoption through the release of USD ALab, intending it to serve as a reference for many USD concepts. “We believe USD to be a foundational tool for our industry and broader communities, and we encourage the release of open source assets to educate and inspire others to conduct their own exploration,” said Group Chief Technology Officer, Darin Grant.
While open source data sets already exist, USD ALab is one of the first real-world implementations of a complete USD production scene. It is a full scene description from global assets through to shot outputs, including referencing, point instancing, assemblies, technical variants and shot-based overrides.
“There are two downloads available, including guiding documents and two sets of textures,” said Supervising Assets TD, Jens Jebens. “The first download contains the ALab scene assets themselves, derived from our production assets and conformed for compatibility to allow them to load in any tool that supports USD. The second download is an optional extra, a production rendering Texture Pack that delivers 4K OpenEXR textures with udims for production style rendering.”
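In practice, loading ALab in any USD-capable tool comes down to opening its root layer, sketched below with Pixar’s USD Python API; the local path is an assumption, and ‘entry.usda’ is the scene’s published entry point, though the exact filename may differ depending on the download.

```python
# Open the downloaded ALab scene with Pixar's USD Python API; the local
# path is an assumption about where the download was unpacked.
import itertools
from pxr import Usd

stage = Usd.Stage.Open('alab/entry.usda')

# Print the first few composed prims to confirm that the references,
# instancing and variants described above resolved correctly.
for prim in itertools.islice(stage.Traverse(), 10):
    print(prim.GetPath(), prim.GetTypeName())
```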
“Beyond the USD assets, we’ve included documentation showing some new ideas and concepts from our experience using USD, including the idea of render procedural definitions, an extremely useful concept that we have not seen in USD to date,” Grant said. “We hope that this combination forms the starting point for future contributors to present their own ideas for discussion, promotion and, hopefully, adoption.”
“The ALab concept was born from Animal Logic’s Art Department,” said Head of Production Technology, Aidan Sarsfield. “Handed a brief for something ‘uniquely Animal’, the team came up with a great story that revolves around a secret backyard shed inhabited by a mad scientist of sorts. The resulting asset suite draws on the unique aesthetic that you’ll find in our studios, and there’s also some fun Easter eggs in there that link back to 30 years of the Animal Logic brand.”
USD ALab is also among the first sets of assets to adopt the Academy Software Foundation’s asset license. Animal Logic wanted to allow the broadest use of these assets to promote education, training and demonstration by students, studios and vendors. “Initially motivated by a desire to create unencumbered assets for our own demonstration and presentation purposes, we realised that the industry at-large could use something similar and pushed to release them,” Aidan said. “I’m excited to see how ALab develops in the community, particularly as we will be extending the data set over time.”
The USD ALab data set is now available and hosted here on Animal Logic’s website through Amazon Web Services. animallogic.com