
Brainstorm is participating in European Project EMERALD to develop and demonstrate AI, Machine Learning and Big Data tools for the digital entertainment and media industries.


Brainstorm, a specialist in real-time 3D graphics, virtual sets and AR systems, is participating in the European project EMERALD, titled ‘AI and Process Automation for Sustainable Entertainment and Media’ and funded by the European Commission under the Horizon Europe Programme.

The project brings together an interdisciplinary consortium of seven partners: companies from the movie, broadcast, streaming and live entertainment technology sectors (British Broadcasting Corporation, Brainstorm Multimedia, Disguise Systems, FilmLight and MOG Technologies), supported by two major European universities, Universidad Pompeu Fabra in Barcelona and Trinity College Dublin.

"EMERALD strives to develop formative tools for the digital entertainment and media sectors by exploiting the potential of AI, Machine Learning (ML) and Big Data applications. Its wider goal is to modernise processing, enhance production efficiency, minimize energy consumption and raise the quality of content through innovation," said Francisco Ibáñez, R&D Project Manager at Brainstorm.

ML and Automation

"Currently, there is a massive increase in the volume of video-based and extended reality content, with a demand for skilled human resources, data processing and energy to deal with it, that cannot be met. This project aims to address this challenge through the development of process automation for sustainable media creation.”

EMERALD aims to apply ML to automate some of the most labour-intensive tasks in video content production, which have considerable implications for the use of time and energy. Javier Montesa, R&D Technical Coordinator at Brainstorm, said that Brainstorm, in collaboration with Universidad Pompeu Fabra (UPF), will develop Deep Learning-based methods and tools for video matting. Video matting refers to techniques used to separate video into layers that identify the foreground and background, and to generate alpha mattes that determine how the layers should blend together.
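As a minimal illustration of the operation involved (an assumed Python sketch, not the EMERALD tooling), once a matting model has estimated an alpha matte for a frame, the foreground layer is blended over any background with a per-pixel weighted sum:

    # Minimal sketch of alpha-matte compositing (illustrative only, not EMERALD code).
    # foreground, background: float arrays of shape (H, W, 3), values in [0, 1]
    # alpha: float array of shape (H, W) from a matting model, 1.0 = pure foreground
    import numpy as np

    def composite(foreground, background, alpha):
        """Blend the foreground over the background using the alpha matte."""
        a = alpha[..., None]  # broadcast the matte across the colour channels
        return a * foreground + (1.0 - a) * background

The blend itself is applied per frame; the hard part the project targets is estimating the alpha matte automatically and in real time.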

Project EMERALD at Universidad Pompeu Fabra

“The main objective is to use deep learning to automatically integrate remote presenters or performers into virtual scenes and sets in real time, producing results of high enough quality for broadcast and streaming media without the need for a trimap,” Javier said. “With the AI enhancement of Brainstorm’s InfinitySet, we will bring the quality achievable through green-screen methods to simpler configurations.” A trimap is a one-channel map that marks which regions of an image are definitely background, definitely foreground, and unknown.
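To make the trimap-free goal concrete, the sketch below is an assumed illustration (the function and its rough_mask input are hypothetical, not part of InfinitySet or the UPF models) of how a trimap is conventionally built from a coarse segmentation mask, the extra step that the deep learning approach is intended to remove:

    # Illustrative sketch of what a trimap encodes; EMERALD aims to avoid needing one.
    # rough_mask: boolean array (H, W), True where a coarse segmentation marks the presenter.
    import numpy as np
    from scipy.ndimage import binary_erosion, binary_dilation

    def make_trimap(rough_mask, band=10):
        """Return a one-channel map: 0 = known background, 0.5 = unknown, 1 = known foreground."""
        sure_fg = binary_erosion(rough_mask, iterations=band)    # shrink the mask: definite foreground
        sure_bg = ~binary_dilation(rough_mask, iterations=band)  # grow and invert: definite background
        trimap = np.full(rough_mask.shape, 0.5, dtype=np.float32)
        trimap[sure_fg] = 1.0
        trimap[sure_bg] = 0.0
        return trimap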

InfinitySet Integration

Javier explained that Brainstorm plans to integrate InfinitySet with UPF’s deep learning systems, which can estimate the actor’s head and body pose. This information will allow the operator to trigger content, such as shadows and reflections, that is displayed automatically on different parts of the scene: on virtual screens, on 3D graphics placeholders, or simply in front of the presenter as he or she moves around.
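As a hypothetical illustration of how such pose data could drive automatic placement (the function and keypoint names are assumptions, not the InfinitySet or UPF interface), estimated 2D keypoints can be reduced to an anchor point for an element such as a contact shadow:

    # Hypothetical sketch: anchor a contact shadow under the presenter from 2D pose keypoints.
    # Not the InfinitySet/UPF API; keypoint names follow common pose-estimation conventions.
    def shadow_anchor(keypoints):
        """Return the image-space point where a contact shadow should be centred."""
        lx, ly = keypoints["left_ankle"]
        rx, ry = keypoints["right_ankle"]
        return ((lx + rx) / 2.0, max(ly, ry))  # midway between the feet, at the lower ankle

    anchor = shadow_anchor({"left_ankle": (610.0, 980.0), "right_ankle": (660.0, 985.0)})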

Francisco noted, “We will explore new ways to improve presenter insertion, to billboard and twist the silhouette, or to calculate its shadow with more precision and realism.”

Furthermore, Brainstorm will be involved in creating tools for automated colour balancing and shot matching. Colour manipulation is typically required in post-production, in virtual production (VP) and in broadcast virtual studios. Automating the colour grading of the presenter to match the virtual scene will be particularly valuable for broadcast virtual studios that do not have colourists.
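As one common baseline for this kind of automation (a sketch assuming a simple statistics-transfer approach, not necessarily the method EMERALD will develop), the presenter layer can be shifted and scaled so its per-channel colour statistics match those of the virtual scene:

    # Minimal sketch of automatic colour matching by per-channel statistics transfer.
    # presenter, scene: float arrays of shape (H, W, 3), values in [0, 1]
    import numpy as np

    def match_colour(presenter, scene):
        """Rescale the presenter layer so its colour statistics match the virtual scene."""
        p_mean, p_std = presenter.mean(axis=(0, 1)), presenter.std(axis=(0, 1)) + 1e-6
        s_mean, s_std = scene.mean(axis=(0, 1)), scene.std(axis=(0, 1))
        matched = (presenter - p_mean) / p_std * s_std + s_mean
        return np.clip(matched, 0.0, 1.0)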

“The integration of this automated colour correction in InfinitySet will simplify the use of the tool and improve the presenter-scene integration when lighting conditions are not controlled or when virtual scene lighting conditions are meant to vary during a programme,” said Javier.

The progress of the project and the results obtained can be followed on the project’s social networks and at www.brainstorm3d.com.