
There are a lot of significant new features in the Nuke 13.0 product family – Nuke, NukeX, Nuke Studio and Nuke Indie 13.0 – the latest big update to Foundry’s compositing software, released earlier this week.

They include native Cryptomatte and Python 3.7 support, and a new viewport architecture based on Hydra, the USD rendering framework, intended to streamline workflow with other USD-compatible software.

But for us, the really interesting feature was the AIR toolset: a new machine learning framework that enables VFX artists to train their own neural networks to automate repetitive tasks like roto and marker removal.

Unlike many of the advanced tools, it’s available in both Nuke and the lower-priced Nuke Indie, as well as NukeX and Nuke Studio, the extended – and more expensive – editions of the software.

Foundry’s website provides an overview of what AIR does, but the firm shared a bit more information in a livestream earlier this week, as part of its Foundry Live 2021 user events.

Below, you can find out what Foundry revealed about machine learning workflows in Nuke during that session, along with a bit of educated guesswork about how the toolset might evolve in future.

AIR: a new machine learning framework that lets VFX artists train their own neural networks
Machine learning has been a research focus for Foundry for some time now: in 2019, the firm released source code for ML-Server, an experimental server-based machine-learning system for Nuke, on its GitHub repo.

However, the new AIR toolset – the name just stands for ‘AI Research’ – is the first appearance of machine-learning tools in the core software, and in a form that dispenses with the need for an external server.

Unlike the AI-based features recently added to software like Autodesk’s Flame, it isn’t simply a set of readymade tools that have been trained using machine learning techniques.

Instead, the framework enables users to train their own neural networks to perform VFX tasks specific to a particular image sequence, then to share those networks with other artists.

Train neural networks with the CopyCat node, then process them with Inference
The process begins with the CopyCat node, which processes source images to train a network. Users feed in both raw frames and ‘ground truth’ images – the same frames after VFX operations have been performed on them – for CopyCat to generate a neural network from.

The trained network is then passed to the Inference node, which applies the same operations automatically to the remaining frames in the sequence.

CopyCat being used to train a neural network to isolate an actor automatically from the background of an image sequence, based on six source frames in which the roto work has been done manually.

Another Foundry Live demo showed CopyCat being trained with six frames of a sequence in which an actor’s beard had been painted out manually, then Inference painting out the remaining 100-plus frames.

The results aren’t intended to be perfect, and will probably need some clean-up, but the workflow should greatly reduce the amount of manual paint work required on a job. Suggested use cases include automating repetitive tasks like roto, garbage matting, marker removal and beauty work, but the system works with “any image-to-image task”.

GPU-accelerated, flexible, and with minimal file overheads
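The CopyCat/Inference workflow described above – fit a model to a handful of raw/ground-truth frame pairs, then apply it to the rest of the sequence – can be illustrated with a toy sketch. To be clear, this is not Foundry’s API and not a neural network: a least-squares colour matrix stands in for the trained network purely to show the shape of the workflow, and the `train`/`inference` function names are hypothetical.

```python
# Toy stand-in for the CopyCat/Inference idea (NOT Foundry's API):
# fit a mapping from a few raw/ground-truth frame pairs, then apply
# it to unseen frames. A per-pixel linear colour transform replaces
# the neural network purely for illustration.
import numpy as np

def train(raw_frames, truth_frames):
    """Least-squares fit of a 3x3 colour matrix plus offset ("the network")."""
    X = np.concatenate([f.reshape(-1, 3) for f in raw_frames])
    Y = np.concatenate([f.reshape(-1, 3) for f in truth_frames])
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])   # add a bias column
    W, *_ = np.linalg.lstsq(X1, Y, rcond=None)      # (4, 3) weight matrix
    return W

def inference(W, frame):
    """Apply the trained mapping to an unseen frame."""
    h, w, _ = frame.shape
    X1 = np.hstack([frame.reshape(-1, 3), np.ones((h * w, 1))])
    return (X1 @ W).reshape(h, w, 3)

# Six frame pairs stand in for the artist's hand-finished ground truth;
# here the "VFX operation" to learn is a simple desaturation.
rng = np.random.default_rng(0)
raws = [rng.random((8, 8, 3)) for _ in range(6)]
desaturate = lambda f: f.mean(axis=2, keepdims=True).repeat(3, axis=2)
truths = [desaturate(f) for f in raws]

W = train(raws, truths)                 # "CopyCat": train on six pairs
out = inference(W, rng.random((8, 8, 3)))  # "Inference": remaining frames
```

Because desaturation is a linear operation, the fitted matrix reproduces it almost exactly on unseen frames; the real system's appeal is that a neural network can learn far more complex image-to-image mappings from the same kind of sparse, artist-supplied examples.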
