Digital Domain Welcomes Gabrielle Gourrier as Executive Vice President of VFX
Digital Domain appointed Gabrielle Gourrier as Executive Vice President of VFX Business. In this role, Gabby will focus on expanding client partnerships and establishing strategic initiatives.
Chaos V-Ray 7 for 3ds Max brings Gaussian Splat support for fast photoreal environments, new ways to create interactive virtual tours and more realistic, controllable lighting to 3D rendering.
VFX Supervisor Morgan McDermott at Impossible Objects talks about making their immersive film for UFC’s debut at the Sphere, combining heroes and symbols of Mexican independence with UFC legends.
Chaos puts Project Arena’s Virtual Production tools to the test in a new short film, achieving accurate ICVFX with real-time raytracing and compositing. Christopher Nichols shares insights.
Moving Picture Company (MPC) has appointed Lucinda Keeler as Head of Production for its London studio, bringing over 20 years of experience and leadership in the VFX industry.
REALTIME studio has launched a Virtual Production division, following its grant from Media City Immersive Technologies Innovation Hub to develop a proprietary Virtual Production tool.
ZibraVDB plugin for Virtual Production and CG studios delivers high compression rates and fast render times, making it possible to work with very large volumetric effects in real-time.
Maxon One 2025 updates Cinema 4D, Redshift, Red Giant and Cineware, and releases ZBrush for iPad, putting ZBrush sculpting tools into a mobile device with a new UI and touch controls.
Das Element asset library software version 2.1 has new video playback controls, hierarchy tree customisation for libraries, faster set-up processes and simpler element migration.
Autodesk returned to SIGGRAPH 2024 to show software updates that include generative AI and cloud workflows for 3D animation in Maya, production scheduling and clip retiming in Flame.
Shutterstock launched a cloud-based generative 3D API, built on NVIDIA Edify AI architecture, trained on licensed Shutterstock content, as a fast way to produce realistic 3D models with AI.
Freefolk has promoted Rob Sheridan to VFX Supervisor in their Film and Episodic division and Paul Wight is now the company’s first Chief Operating Officer.
Pixotope, the live photo-realistic virtual production system from The Future Group, now runs on version 1.3 software, updated to help creators improve interactions between their virtual environments and real-world elements.
Pixotope forms the central production hub when creating mixed-reality (MR) content for broadcast and live events, and has tools that help producers bring together physical components, such as presenters, actors, props and free-moving cameras, effectively with virtually created assets such as backgrounds, graphics, animated characters or any other CG elements. Version 1.3 has new object tracking and lighting integration functionality, and more control over colour management.
The Future Group’s Chief Creative Officer Øystein Larsen said, “The success of a mixed reality scene depends on the relationship and interactivity between real and virtual components. Part of this success depends on technical accuracy, such as matching lighting, replicating freely moving cameras and precise, invisible keying. But there is also an emotional aspect which flows from enabling presenters and actors to freely express themselves through unrestricted movement, and interacting with virtual objects as they would real objects. Version 1.3 of Pixotope improves on results in both of these areas.”
Using Real-time Tracking Data
A major advance is in the ability to integrate and use data from real-time object tracking systems. This allows Pixotope to use the positions of moving tracking locators in the real world environment and attach them to digitally created objects, so that those objects can be made to follow the tracked motion. This means presenters can pick up and rotate graphics or any other virtually generated asset in a natural, expressive manner, opening many more creative possibilities. From showing a 3D model in the palm of their hand, to controlling any aspect of a virtual scene with their own physical movement, presenters and actors become free to interact with the virtual world around them.
Another advantage of Object Tracking is that presenters themselves can be tracked, so that Pixotope is continuously aware of where in the scene they are. A challenge in conventional virtual studios is that presenters must be careful about where they stand and when. Presenters cannot walk in front of graphics that have been composited over the frame, for example. However, when accessing an object's position and orientation through Pixotope’s Object Tracking interface, Pixotope recognises where the presenter is relative to the position of other generated items within three-dimensional space and handles the occlusions accordingly. Actors are free to walk in front of, behind or even through virtual objects.
Matching Lights and Colour
Also new in Pixotope Version 1.3 is the ability to control physical lights using DMX512 over the Art-Net distribution protocol. This enables Pixotope to synchronise and control any DMX controllable feature of physical studio lights with controls from the digital lighting set-up used to illuminate virtual scenes. As a result, light shining on the real set will match the light in the virtual scene. Lights can then either be driven via an animation pre-set, or by using a new Slider widget available for user-created Pixotope control panels. These panels can be accessed via a web browser on any authorised device and be operated either by a technician or the presenter.
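Pixotope’s DMX implementation is internal to the product, but as background on the protocol involved: Art-Net carries DMX512 channel levels over UDP in ArtDmx packets, one universe of up to 512 channels per packet. The following Python sketch builds such a packet from scratch; the universe number and broadcast address in the usage comment are illustrative, not taken from Pixotope.

```python
import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port

def build_artdmx(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet carrying up to 512 DMX channel levels."""
    if not 1 <= len(channels) <= 512:
        raise ValueError("DMX payload must be 1-512 bytes")
    packet = b"Art-Net\x00"                     # fixed 8-byte packet ID
    packet += struct.pack("<H", 0x5000)          # OpCode: OpDmx (little-endian)
    packet += struct.pack(">H", 14)              # protocol version 14 (big-endian)
    packet += bytes([sequence, 0])               # sequence number, physical port
    packet += struct.pack("<H", universe)        # SubUni + Net (little-endian)
    packet += struct.pack(">H", len(channels))   # data length (big-endian)
    return packet + channels

# Example: set the first fixture's dimmer (DMX channel 1) to full
packet = build_artdmx(universe=0, channels=bytes([255] + [0] * 511))

# To transmit, send the bytes over UDP, e.g.:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, ("192.168.0.255", ARTNET_PORT))  # illustrative address
```

A controller resending this packet with updated channel values each frame is, in essence, what a lighting sync such as the one described above does: the virtual scene’s digital light intensities are sampled and written into the DMX channel slots of the physical fixtures.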
Pixotope Version 1.3 also further improves the results of chroma keying for green screen studios with new functions that help extract greater detail, like fine strands of hair and shadows. New algorithms process key edges to sub-pixel accuracy, improve colour picking and automate the reduction of background screen colour spill.
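Pixotope’s keying algorithms are proprietary, but the spill reduction described above is commonly implemented by clamping a pixel’s green channel against its red and blue channels, so that green contamination from the screen is pulled back toward a plausible foreground colour. A minimal, illustrative sketch of that classic heuristic (not Pixotope’s actual algorithm):

```python
def despill_green(r: float, g: float, b: float, mix: float = 1.0) -> tuple:
    """Reduce green-screen colour spill on one pixel.

    Clamps green toward the average of red and blue, a classic despill
    heuristic. Channels are floats in [0, 1]; mix (0-1) blends the
    correction in, so partial suppression is possible.
    """
    limit = (r + b) / 2.0
    if g > limit:
        g = g - mix * (g - limit)   # pull excess green down toward the limit
    return (r, g, b)

# A skin-tone pixel contaminated by screen spill: excess green is clamped
# to the red/blue average, (0.8 + 0.6) / 2.
despill_green(0.8, 0.75, 0.6)   # → roughly (0.8, 0.7, 0.6)
```

Production keyers layer refinements on top of this idea, such as per-pixel spill maps and edge-aware blending, but the core operation is this per-channel clamp applied across the image.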
Colour Management control can now be accessed in Pixotope’s Editor Viewport to make sure that artists working in any practical colour space, including HDR, feel confident of the colour fidelity of the images they are creating.
Pixotope is natively integrated with the Unreal game engine, and in Pixotope 1.3 all the new functions in UE Version 4.24 are available to users. The changes include layer-based terrain workflows that help create adaptable landscapes, dynamic physically accurate skies that can be linked to actual time-of-day, improved rendition of character hair and fur, as well as increased efficiency for global illumination that helps to create photo-real imagery. www.futureuniverse.com
Nuke 12.2 builds on updates in Nuke 12 so far, and adds a new product to the Nuke software series for compositors. Named Nuke Indie, the new software is a functionally restricted version of Nuke Studio that independent artists can use on commercial projects, and is limited to one seat per user or organisation earning less than $100,000 per year.
It aims to make Nuke’s tools more accessible to people working outside a post production studio. It combines the node-based compositing functionality of NukeX with the conform, editorial and review capabilities in Nuke Studio. It also extends the reach and adds to the community of Nuke users.
Although Nuke Indie can’t be used in a pipeline with other Nuke licenses, commercial or indie, all Nuke and NukeX nodes, including WriteGeo and Primatte, are contained in the package, with all supported Nuke Studio conform and editorial functions. It is compatible with most formats and codecs, except AAC compressed audio and H.264/AVC video, and video can be sent out to an external monitor.
Nuke 12.2 (screenshot Aaron Sims Creative)
Nuke Indie can be used to read commercial Nuke and Hiero scripts, build simple Python integrations and use BlinkScript in the timeline and nodegraph – although Python API support is limited. Among the functional limitations are Nuke Indie’s encrypted file format (.nkind/.hroxind/.gzind) and 4K render resolution. Also, NDK and 3rd party plug-ins are not supported, and external rendering is disabled.
Users receive regular maintenance updates and licensing support, and are granted access to a community forum specifically for Indie users. The software is available as an annual subscription, purchased online.
USD Support, Sync Review, License Roaming
Native support for USD via the ReadGeo node.
Nuke 12.2 is the third release in the Nuke 12 series. One of version 12’s main drivers has been the artist experience – the UI is more interactive, with support for high-DPI monitors and a more stable, consistent Shuffle node. Data can also be moved into compositing more easily thanks to better format support and EXR performance and, for Nuke’s timeline-based tools, the playback engine has been rebuilt to view multiple shots in context in Nuke Studio and Hiero.
Now in 12.2, Nuke adds native support for USD, allowing artists to read USD data through Nuke’s ReadGeo node and work with USD collections in the same way they work with other 3D formats. They can browse geometry contained within a USD file through a dedicated scenegraph UI, with support for normals, colour data and animated geometry.
These extensions to ReadGeo will be open sourced, making it possible to integrate the updates into custom USD tools. This initial support should also make it easier to experiment with using USD in Nuke. Studios already using USD can adapt this node to build up a more integrated user experience.
Sync Review
A beta feature in Nuke 12.2 is Sync Review. This first Sync Review implementation will allow Nuke Studio, Hiero and HieroPlayer users in multiple locations to review and annotate footage collaboratively. Each participant has access to the playback controls, annotations and version system during collaborative review sessions. The number of users and the resolution are limited only by hardware and bandwidth. To set up a session, footage needs to be available to all participants, either locally or through a central server connecting to network storage, cloud storage such as Dropbox, or localised files.
Collaborative review sessions in Sync Review.
Foundry plans to extend the function further but, for now, Sync Review will support teams working remotely. Push updates, like text messages and mobile alerts, help users stay in sync with the session. The Hiero API has been extended for viewer control, project loading and saving so that users can customise their own synced workflows. Note that, because Nuke Indie does not have access to Beta/Alpha releases, Indie users won’t be able to use Sync Review.
License Roaming
New License Roaming is available to users with active maintenance of Nuke, NukeX, Nuke Studio and Hiero, and means they can check out licenses from floating licence servers to use offline for up to 30 days at a time, for example, when going on set or working remotely.
Nuke 12.2 has a new MOV Reader and Writer, replacing the previous 32-bit and 64-bit QuickTime support with a more stable and flexible system for working with QuickTimes. QuickTime codec support, especially H.264 encoding, will be extended across operating systems but some of the older QuickTime codecs will no longer be supported.
Exporting DNxHR .mov files has been added to all Nuke software, and DNxHR MXF support is added to Nuke Studio and Hiero. This gives the timeline products the same functionality that was added to Nuke in Nuke 12.1. Nuke Studio, Hiero and HieroPlayer will now also support reading and playback of AAC encoded audio tracks. Nuke 12.1 brought support for .mov containers holding audio to Linux and Windows, which made it unnecessary to extract and import the audio as a separate WAV file. www.foundry.com
AEAF’s virtual event is taking place in just under one month. The Speaker Program line-up has grown to cover topics ranging from VFX and animation in TV shows and movies to learning how to wrap your pipeline around USD on a student budget. We are now also hard at work judging the wonderful entries to the AEAF Awards. We will soon announce the Finalists and cut the Showreel to be screened online this year for the very first time.
REGISTER to attend on your favourite screen on 14 August. See more detail about the Speakers here.
The speakers you will see and hear at AEAF this year are presenting from all over the world – Canada, the US, London and Australia – and from diverse backgrounds, and all share a dedication to visual storytelling and a love of film.
Alexis Wajsbrot, who has also co-directed his own film, will take you on Framestore’s winding journey with Marvel to conceive and create the illusion battle Spider-Man fights with Mysterio in 'Spider-Man: Far From Home’.
Robert Bock from Rodeo FX is an accomplished Director of Photography who now holds the special role of Head of Live Action studio at Rodeo. His photographer's eye also made him the ideal candidate to supervise VFX for ‘Tales from the Loop’, translating paintings to the TV screen.
FX Lead Luke Gravett and Character FX and Crowd TD Chelsea Shannon are talking from not one but two Animal Logic studios, Sydney and Vancouver, about how they worked their way into the roles they now fill developing tools and pipeline for feature animation projects.
DNEG’s Ben Wiggs has devoted his working life to VFX animation for 17 years, starting with commercial stories at MPC and, most recently, treading that fine line between mechanical and human motion in the robots of ‘Westworld’ season 3, the topic of his talk.
Nelson Sepulveda-Fauser is one of the ILM team responsible for de-aging the principal actors to help tell the story in 'The Irishman', and will talk about his team’s work on the project, an Oscar nominee, from the perspective of 16 years at ILM.
As a VFX TD who enjoys getting his hands dirty, Technical Lead Daniel Flood at UTS ALA will demonstrate how teams can start using USD, one of the fastest growing developments in VFX today, as the basis of their pipeline, from Shotgun integration through lighting and automated rendering – all without breaking the budget.
AEAF Awards’ Feature Film categories, VFX and Animation, have received their largest and most varied lineup of entries yet. Everyone devoted to filmmaking – artists, animators, producers and directors – is invited to tune in and watch the AEAF 2020 showreel on 14 August, which will include breakdowns and before/after clips of many of these projects. Among them are superhero blockbusters and monster thrillers from the major studios, plus independent titles from around the world, a feature documentary and kids’ movies.
Unfortunately, several of the vendors are not able to post their VFX reels publicly in AEAF’s online section; nevertheless, the work is beautiful and seeing it in the AEAF lineup is really exciting.
Weta Digital 'Jumanji: The Next Level'
It’s great to have a chance to see VFX and animation from more than one vendor on the same film. Look out for work from Method Studios and Weta Digital on ‘Jumanji: The Next Level’, Framestore and Weta Digital on ‘Lady and the Tramp’, and Rodeo FX and Framestore on ‘The Aeronauts’.
Virtual production was a critical factor in two of the movies from MPC, ‘The Lion King’ and ‘1917’. Not only did their work result in tremendous visualisations of the films’ stories, but the projects became important R&D opportunities for more productions of this kind in the future.
Framestore ‘Pokémon: Detective Pikachu’
Amazing talent for photoreal animals and creatures that play starring roles is evident in several entered projects - ‘Lady and the Tramp’, ‘Call of the Wild’, ‘The Lion King’, ‘Pokémon: Detective Pikachu’, 'Jumanji: The Next Level' and more.
Terrifying monsters like MPC's 'Godzilla: King of the Monsters' are one thing but horror is another – although judging by the monster builds and animations by the artists at Rodeo FX for ‘Crawl’, Method Studios for ‘IT Chapter 2’, Cutting Edge for ‘The Invisible Man’ and MPC for ‘Underwater’, the line between them is very fine. The edge of your seat will be the safest place to watch from.
Fin Design 'Lost in Russia'
Environments are the stars in some of the projects, including – among others – RSP’s race track in ‘Ford v Ferrari’, a hot air balloon’s view of the English countryside for ‘The Aeronauts’ by Rodeo FX and Framestore, wild Africa made by MPC for ‘The Lion King’ and the vast Russian winter under moonlight in ‘Lost in Russia’ created by Fin Design. This last movie, made by filmmaker Xu Zheng in China, only just missed a theatrical release after the COVID-19 health crisis closed all cinemas in China. It was made available for free viewing over producer Huanxi Media’s streaming platform.
Animated features for children – and grownups who love animation – bring great stories to life in ‘100% Wolf’ by Flying Bark Productions, ‘The Addams Family’ produced by Cinesite and ‘Upin & Ipin: The Lone Gibbon Kris’ from Les' Copaque Production in Malaysia. These three projects have very different animation styles and stories, and represent a huge amount of work.
Cinesite 'The Addams Family'
Since inspiration and courage are essential for all aspiring filmmakers, you will find both in abundance in a documentary feature film titled ‘Be Natural: The Untold Story of Alice Guy-Blaché’ from PIC Agency & Be Natural Productions. Alice Guy-Blaché (1873-1968) was the first female filmmaker and wrote, produced or directed 1,000 films during her career. The footage for this project is mainly archival, resulting from 8 years of research, and also includes work from an animation team.
MAXON has acquired Redshift Rendering Technologies Inc, developers of the Redshift rendering engine. Redshift is a GPU-accelerated, biased renderer with a wide set of tools that makes rendering complex 3D projects notably faster than many traditional rendering systems. Redshift is available as a plugin for MAXON’s Cinema 4D and several other 3D applications including Maya, 3ds Max, Houdini and Katana.
Recognising that rendering can be the most time-consuming and demanding aspect of 3D content creation, MAXON believes that Redshift’s speed and efficiency combined with Cinema 4D’s responsive workflow make an effective combination for their users. The two companies are planning to continue collaborating on the development of a very close integration of Redshift into Cinema 4D.
Redshift’s current list of customers includes Technicolor, Digital Domain, Encore Hollywood and Blizzard. Projects in which Redshift has been employed for VFX and motion graphics include ‘Black Panther’, ‘Aquaman’, ‘Captain Marvel’, ‘Rampage’, ‘American Gods’, ‘Gotham’, ‘The Expanse’ and others.
Redshift and Cinema 4D Development
For Redshift and Cinema 4D users, the most important aspects of the acquisition are that Redshift pricing will remain unchanged, and Redshift can still be purchased through the usual channels. Redshift will remain available for Maya, 3ds Max, Houdini and Katana, and the planned Blender plugin will still go ahead. More integrations may be considered in the future, and Cinema 4D will also continue to support other third-party render engines through its plugin architecture. Release plans and development for Redshift 3.0 will not be affected either.
The existing combination of Cinema 4D and Redshift has been a success for reasons including ease of use, stability, reliability and efficiency that characterise both pieces of software. Redshift is a high-performance renderer that supports biased rendering for fast, noise-free renders. Designers comment that using Redshift with Cinema 4D, known for its efficient workflow, saves time and money, and supports creativity.
As well as speed, Redshift has tools focused on producing photorealistic imagery comparable to that of unbiased rendering engines. For example, Redshift’s RenderView Interactive Preview Region (IPR), which reflects changes to a scene in close to real-time, allows users to adjust settings as they work on scenes with no downtime.
Into the Future
It is anticipated that MAXON's infrastructure and resources will give the developers of Redshift a chance to focus on their core competencies and expand their reach into the market. Although the entire Redshift company has been acquired, their team has been retained and will possibly be expanded. Their members will continue to be instrumental in the software’s future development and to work with MAXON’s product management and rendering development team.
Furthermore, while Cinema 4D is well known for motion graphics, Redshift’s development focus and plans will remain as they are, including the development of Redshift version 3.0, which may progress more rapidly now. Its fast, iterative workflow and final rendering are very well suited to motion graphics as well as other 3D work. Redshift and MAXON remain dedicated to serving existing and potential visual media markets including visual effects and visualisation.
Maxon will continue to make robust API and development support available for all third-party rendering engines, should users have preferences and needs for specific rendering engines. www.maxon.net