
In July 2024 the NPR (Non-Photorealistic Rendering) project officially started with a workshop between Dillon Goo Studios and Blender developers.
While the use cases were clear, the architecture and overall design were not. To help with this, the team started working on a prototype containing many shading features essential to the NPR workflow (such as filter support, custom shading, and AOV access).

This prototype received a lot of attention, with users contributing many nice examples of what is possible with such a system. The feedback showed that there is strong interest from the community in a wide range of effects.
However, the flexibility made possible by the prototype came at a cost: it locked NPR features within EEVEE, excluding Cycles from part of the NPR pipeline. It also deviated from the EEVEE architecture, which could limit future feature development.
After much consideration, the design was modified to address these core issues. The outcome can be summarized as:
- Move filters and color modification to a multi-stage compositing workflow.
- Keep shading features inside the renderer’s material system.
Multi-stage compositing
One of the core features needed for NPR is the ability to access and modify the shaded pixels.
Doing this inside a render engine has been notoriously difficult. The current way of doing it inside EEVEE is to use the ShaderToRGB node, which comes with a lot of limitations. In Cycles, only limited effects can be achieved using custom OSL nodes.
As a result, in production pipelines this is often done through very cumbersome and time-consuming scene-wide compositing. The major downside is that all asset-specific compositing needs to be manually merged and managed inside the scene compositor.
Instead, the parts of the compositing pipeline that are specific to a certain asset should be defined at the asset level. The reasoning is that these compositing nodes define the appearance of that asset and should be shared between scenes.
Multi-stage compositing is just that! A part of the compositing pipeline is linked to a specific object or material. This part receives the rendered color as well as its AOVs and render passes as input, and outputs the modified rendered color.
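As a rough mental model (hypothetical Python, not the actual Blender API; the real stages will be built from compositor nodes), each asset can be thought of as carrying a small compositing function that runs on its own rendered color and AOVs before the scene compositor combines everything:

```python
# Conceptual sketch only (hypothetical names, not the Blender API): a per-asset
# compositing stage maps the asset's rendered color and AOVs to a stylized color.
from dataclasses import dataclass, field

@dataclass
class RenderedAsset:
    color: tuple                               # shaded RGB for this asset
    aovs: dict = field(default_factory=dict)   # e.g. {"outline_mask": 1.0}

def asset_compositing_stage(asset: RenderedAsset) -> tuple:
    """Per-asset stage: takes the rendered color and AOVs, returns the modified color."""
    outline = asset.aovs.get("outline_mask", 0.0)
    r, g, b = asset.color
    # Illustrative stylization: darken wherever the outline AOV is set.
    return (r * (1.0 - outline), g * (1.0 - outline), b * (1.0 - outline))

# The scene compositor then only combines already-stylized assets; the
# asset-specific nodes travel with the asset file instead of the scene file.
suzanne = RenderedAsset(color=(0.8, 0.6, 0.4), aovs={"outline_mask": 1.0})
background = RenderedAsset(color=(0.2, 0.2, 0.3))
print([asset_compositing_stage(a) for a in (suzanne, background)])
```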

In this example, the appearance of the Suzanne object is defined at the object level, inside its asset file. When linked into a scene with other elements, it is automatically combined with the other assets.

This new multi-stage compositing will reuse the compositor nodes, with a different subset of nodes available at the object and material levels. This is an opportunity to streamline the workflow between material node editing and compositor node editing.
Grease Pencil Effects can eventually be replaced by this solution.

There is a lot more to be said about this feature. For more details, see the associated development task.
Anti-Aliased output
A major issue when working with a compositing workflow is Anti-Aliasing (AA). When compositing anti-aliased input, the results often include hard-to-resolve fringes.

The common workaround to this issue is to render at a higher resolution without AA and downscale after compositing. This method is very memory intensive and only allows for 4x or 9x AA (i.e. rendering at 2x or 3x the resolution), usually with less than ideal filtering. Another option is to use post-process AA filters, but that often results in flickering animations.

Right: Anti-Aliasing is done after the compositor.
The solution to this problem is to run the compositor for each AA step and filter the composited pixels like a renderer would. This produces the best image quality with only the added memory usage of the final frame.
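To make the difference concrete, here is a minimal sketch (plain Python, not engine code) of the two orders of operations for a single edge pixel, assuming the per-asset compositing step is a sharp toon-style threshold:

```python
# Hypothetical illustration: compositing an already anti-aliased value vs.
# running the compositor per AA step and applying the pixel filter afterwards.
def toon_composite(shaded):
    # Stand-in for a sharp per-asset compositing step: hard two-tone quantization.
    return 1.0 if shaded > 0.4 else 0.1

# 8 sub-pixel AA samples at an object edge: 3 hit the bright object, 5 the dark background.
samples = [0.8, 0.8, 0.8, 0.05, 0.05, 0.05, 0.05, 0.05]
weight = 1.0 / len(samples)   # box pixel filter, for simplicity

# Compositing anti-aliased input: the blended edge value snaps to a single tone,
# which is where the hard-to-resolve fringes come from.
aa_then_composite = toon_composite(sum(s * weight for s in samples))

# Proposed order: composite each AA sample, then apply the pixel filter.
composite_then_aa = sum(toon_composite(s) * weight for s in samples)

print(aa_then_composite)   # 0.1    (the filtered edge coverage is lost)
print(composite_then_aa)   # 0.4375 (a properly filtered edge between the two tones)
```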
Converged input
One of the main issues with modern renderers is that their output is noisy. This doesn't play well with NPR workflows, as many effects require applying sharp transformations to the rendered image or light buffers.
For instance, this is what happens when applying a constant-interpolated color ramp to the Ambient Occlusion node: the sharp transformation is applied to the noisy per-sample values and the noise is then averaged into the result, instead of the input being converged first and the transformation applied afterwards.
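A minimal numeric sketch of the issue, assuming the constant-interpolated color ramp behaves like a hard threshold on the ambient occlusion value:

```python
# Hypothetical illustration: applying a sharp transform to noisy per-sample AO
# vs. applying it to the converged value.
import random

def color_ramp(ao):
    # Constant interpolation: hard black/white transition at 0.5.
    return 1.0 if ao > 0.5 else 0.0

random.seed(3)
true_ao = 0.4   # the converged ambient occlusion value for this pixel
noisy_samples = [min(max(true_ao + random.gauss(0.0, 0.2), 0.0), 1.0) for _ in range(256)]

# Ramp applied per noisy sample, then averaged (what happens inside the renderer today):
ramp_then_average = sum(color_ramp(s) for s in noisy_samples) / len(noisy_samples)

# Ramp applied to the converged value (what running on converged inputs enables):
average_then_ramp = color_ramp(sum(noisy_samples) / len(noisy_samples))

print(ramp_then_average)   # roughly 0.3: the sharp ramp is washed out into grey
print(average_then_ramp)   # 0.0: the intended clean result
```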

Doing these effects at compositing time gives us the final converged image as input. However, as explained above, the compositor needs to run before the AA filtering.
So the multi-stage compositor needs to be able to run on inputs that are converged or denoised but still free of AA. In other words, the render samples will be distributed between render pass convergence and final compositor AA.
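As a purely illustrative example of such a split (the actual numbers and scheme are not decided here), the sample budget could look like this:

```python
# Illustrative only: one possible way a fixed sample budget could be split
# between converging the render passes and anti-aliasing the composited result.
total_samples = 64
compositor_aa_steps = 8                                     # compositor runs once per AA step
convergence_samples_per_step = total_samples // compositor_aa_steps  # passes converge within each step
print(compositor_aa_steps, convergence_samples_per_step)    # 8 AA steps x 8 convergence samples
```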
Engine Features
While improving the compositing workflow is important for stylization flexibility, some features are better suited to the inside of the render engine. This allows built-in interaction with light transport and other renderer features. These features are not exclusive to NPR workflows and fit well inside the engine architecture.
As such, the following features are planned to be directly implemented inside the render engines:
- Ray Queries
- Portal BSDF
- Custom Shading
- Depth Offset
The development will start after the Blender 5.0 release, planned for November 2025.
Meanwhile, to follow the project, subscribe to the development task. For more details about the project, join the announcement thread.