Real-time Compositor

In this article, Omar introduces the real-time compositor project, describes its current state, and shows off some of its capabilities through a series of demos.


The Project

The aim of this project is to develop a new compositor back-end, taking advantage of GPU acceleration to be performant enough for real-time interaction.

As a first step, this new back-end will be used to power the Viewport Compositor, a new shading option that applies the result of the Compositor Editor node-tree directly in the 3D Viewport. Artists will not have to wait for a full render to start compositing, allowing for faster and more interactive iteration on their projects.

In the long term, the goal is for it to power the existing Compositor Editor.


How To Use It

To use the Viewport Compositor, you will need to use an experimental branch:

  • In Preferences → Interface → Display, enable Developer Extras.
  • In Preferences → Experimental, enable Real-time Compositor.
  • In the 3D Viewport, turn on Compositor under the Viewport Shading panel while in Material Preview or Rendered view.
  • Open the Compositor Editor and start compositing as usual.

Note: not all nodes have been implemented yet. See the list.

How to enable the Viewport Compositor
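
For scripted setups, the same options can be toggled from Python. The snippet below is a minimal sketch: the show_developer_ui preference is a stable API, but use_realtime_compositor and use_compositor are assumed property names from this experimental branch and may change before merging.

    import bpy

    prefs = bpy.context.preferences
    prefs.view.show_developer_ui = True                # Developer Extras
    prefs.experimental.use_realtime_compositor = True  # assumed flag name

    # Turn on the compositor in the shading settings of every 3D Viewport.
    for area in bpy.context.screen.areas:
        if area.type == 'VIEW_3D':
            space = area.spaces.active
            space.shading.type = 'MATERIAL'      # or 'RENDERED'
            space.shading.use_compositor = True  # assumed property name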

Demos

In this scene, we add a watercolor texture for the background, overlay that texture on the ghost, generate a smooth outline from the blurred alpha, and mix it with the image.

Ghost demo
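
For those who script their node trees, the outline trick can be sketched in Python: blur the render’s alpha, subtract the original alpha so only a soft rim remains, and use that rim as the mix factor. This is an illustrative sketch rather than the exact demo file; node identifiers are the standard compositor idnames and the values are placeholders.

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree
    tree.nodes.clear()

    render = tree.nodes.new('CompositorNodeRLayers')

    # Blur the alpha channel to produce a halo around the subject.
    blur = tree.nodes.new('CompositorNodeBlur')
    blur.size_x = blur.size_y = 20

    # Subtract the original alpha so only the soft rim remains.
    rim = tree.nodes.new('CompositorNodeMath')
    rim.operation = 'SUBTRACT'

    # Mix the outline color over the image, driven by the rim.
    mix = tree.nodes.new('CompositorNodeMixRGB')
    mix.inputs[2].default_value = (0.0, 0.0, 0.0, 1.0)  # outline color

    out = tree.nodes.new('CompositorNodeComposite')

    links = tree.links
    links.new(render.outputs['Alpha'], blur.inputs['Image'])
    links.new(blur.outputs['Image'], rim.inputs[0])
    links.new(render.outputs['Alpha'], rim.inputs[1])
    links.new(rim.outputs['Value'], mix.inputs['Fac'])
    links.new(render.outputs['Image'], mix.inputs[1])
    links.new(mix.outputs['Image'], out.inputs['Image'])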

Next, a simple product visualization of a flashlight with a transparent background rendered in EEVEE. A dark background is added via the Alpha Over node, followed by a sharpen filter, some lens dirt from an image, lens distortion, and finally some color grading.

Flashlight demo.
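
The flashlight chain can likewise be approximated in Python. Again, a rough sketch rather than the exact demo setup: the background color, filter settings, and the lens_dirt.png path are placeholders.

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree
    tree.nodes.clear()

    render = tree.nodes.new('CompositorNodeRLayers')  # transparent EEVEE render

    over = tree.nodes.new('CompositorNodeAlphaOver')  # dark background
    over.inputs[1].default_value = (0.01, 0.01, 0.01, 1.0)

    sharpen = tree.nodes.new('CompositorNodeFilter')
    sharpen.filter_type = 'SHARPEN'

    dirt = tree.nodes.new('CompositorNodeImage')          # lens dirt texture
    dirt.image = bpy.data.images.load('//lens_dirt.png')  # placeholder path
    add_dirt = tree.nodes.new('CompositorNodeMixRGB')
    add_dirt.blend_type = 'ADD'

    distort = tree.nodes.new('CompositorNodeLensdist')
    grade = tree.nodes.new('CompositorNodeColorBalance')  # final color grading
    out = tree.nodes.new('CompositorNodeComposite')

    links = tree.links
    links.new(render.outputs['Image'], over.inputs[2])  # foreground
    links.new(over.outputs['Image'], sharpen.inputs['Image'])
    links.new(sharpen.outputs['Image'], add_dirt.inputs[1])
    links.new(dirt.outputs['Image'], add_dirt.inputs[2])
    links.new(add_dirt.outputs['Image'], distort.inputs['Image'])
    links.new(distort.outputs['Image'], grade.inputs['Image'])
    links.new(grade.outputs['Image'], out.inputs['Image'])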

In fact, one can ignore the 3D Viewport entirely and use external resources instead, for instance a movie clip.

Using external sources.
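
Scripting that is equally simple. A minimal sketch, assuming a movie clip has already been loaded into the blend file:

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree
    tree.nodes.clear()

    # Feed the compositor from a movie clip instead of the render.
    clip_node = tree.nodes.new('CompositorNodeMovieClip')
    clip_node.clip = bpy.data.movieclips[0]  # any loaded clip

    out = tree.nodes.new('CompositorNodeComposite')
    tree.links.new(clip_node.outputs['Image'], out.inputs['Image'])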

Current State

Even though the project is still a work in progress, it is already in a usable state. The plan is to land it in master as an experimental feature and continue development there. Many of the nodes are not yet supported, and there are a number of known issues and limitations; refer to T99210 for more information on the current state of the project.

Try It Out

Experimental builds are available at builder.blender.org. The code has already been submitted for review, so it should soon be available (marked as experimental) as part of the daily builds (not for Blender 3.3, since the window for adding large features to it has closed).

Support the Future of Blender

Donate and support Blender Foundation to work on core Blender development.

55 comments
  1. Why does it crash every time I try to do something?

  2. Where can I download this branch? Has it been merged into the new Blender yet? I clicked through the link and didn’t find this branch.

    • Download from here instead:
      https://builder.blender.org/download/experimental/archive/temp-viewport-compositor-merge/

      We plan to merge it to master this week, so it should be available as part of the daily builds soon.

      • “We plan to merge it to master this week” — Incredible news! Thank you and those working with you on this project for all your hard work, Omar. Here’s hoping for inclusion in 3.4, but if not just that means I’ll be even more excited for 3.5 <3

        • This looks incredible and I want to try it out right now. If I want to build it myself, should I just build master or is there a better branch to follow for the latest work?

          • The latest code is a series of patches, so it is a bit hard to build yourself.
            But I will hopefully merge it to master next week, so you can probably just wait until it is merged.

  3. Thank you!! This is so awesome! I just need the Cryptomatte node! Really can’t wait until it is implemented.

  4. I don’t get the option to enable the real-time compositor, using the latest 3.3 alpha on Windows.

  5. Amazing work, always wanted to have a real-time compositor in Blender. One question though:
    will the render passes that Cycles creates be usable in real time? For example,
    mist and z-depth would be the best for some real-time mist!

    • Yes, multi-pass compositing will be supported, including using Cycles passes.

      • That’s great! Can’t wait to use it! Thanks for the quick reply.

  6. I tested it and it’s already useful! Now what I’d like to have in the future is:
    – composite-aware mesh editing like modifier’s “on cage” option
    – non-current scene support in render layers node

    thx

  7. Awesome! It immediately worked. I’m excited for more nodes to work. Especially Z-Depth.

  8. Fantastic work! I’m beyond excited for this!

    In the future, are there any plans to get the viewport camera preview to match 1:1 with the final composite? Currently there are a ton of resolution-dependent effects that render viewport previews a bit moot.

    The ability to preview the active camera at the output resolution in the viewport window would solve a lot of these discrepancies.

  9. Really nice work, I’ve been dreaming of this for ages.
    I wonder, do you have plans to add what are currently the “grease pencil VFX modifiers” to the mix? Since they’re pixel-level FX, I think they’d be a good fit to convert and unify with this!

    in any case great job Omar! Looking forward to seeing this evolve!

  10. I am beyond amazed by your work. Would it be terribly difficult to implement real-time video I/O? It‘d be extremely cool to use Blender as a compositing tool in streams or video meetings. The keyers in OBS or Zoom are just terrible and colour correction is simply impossible. But with Blender we could have almost Nuke-style 3D-compositing but real-time… ^^

    • I am not familiar with the requirements of real-time video I/O, but it is probably out of the scope of this project for the moment.

  11. Wow. This thing is mind-blowing. It’s actually something I dreamed about a long time ago. C4D’s Cineware plugin for After Effects tried to implement something like this but failed miserably.
    1. because C4D’s renderers are slow as f***
    2. After Effects is very slow and inefficient
    3. The biggest reason: you had a static layer inside of AFX, so there was no real benefit over image sequences. And that is the ultimate beauty of your approach: being able to not only fine-tune the image in comp but still adjust everything in the 3D scene. Like changing literally everything.

  12. Amazing! Thank you

    Is it also a step toward a possible “compositor per texture” at some point?
    Allowing diverse operations on an image/texture (or whatever the compositor can do) and getting the result as a texture in materials?

  13. Great work. This is one of my most anticipated features so far!

    But I do have a question though, sorry if it was already answered. Will this work with Cycles as well? All the demos are Eevee only.

    • Yes, it will work with Cycles and any render engine that supports viewport rendering.

  14. Does this mean that it’s possible to make Freestyle real-time? I hope to the heavens it’s true!

    • I don’t think this will accelerate Freestyle in any way, since the bottleneck for Freestyle is not compositing.

  15. This is incredible!

    But the biggest question that bothers me is: will there be a possibility to composite render layers from different engines? For now I see the VC works only with the active scene’s render engine. No matter if I split the shader output for EEVEE and Cycles and use those engines in separate scenes – for both scenes the shader compiles only for the active scene’s render engine. But if we could get the possibility to mix EEVEE and Cycles in real time – that would be a REALLY huge thing!

    Can we expect such a feature?

  16. Are there plans for more realistic glare and bloom? Or adding an uber node – a kind of node with many features, all in one, like exposure, white balance, sharpening, highlight compression, LUTs, etc.?

    • We can definitely extend the compositor with new features, but new nodes are out of the scope of the project and will not be a priority for me for the moment.

  17. Very cool. What does this mean for Blender’s future in virtual production? Is the “interactive mode” plan still a thing?

  18. If only Blender’s compositor supported third-party plugins :(
    But I assume it’s technically impossible in the current implementation.

    • ummm… it’s open source. anybody can write plugins for it.

  19. such a good addition to blender, can’t wait to test it when the filter nodes are in, gj !

    • same! i can’t wait for the glare node to not be slow asf on high quality lol.

  20. So cool! this is magic!
    I guess we can say goodbye to the bloom option in the EEVEE render panel. I always thought it was a little bit out of place anyway.
    BTW, will we have a multi-subeditor compositor? Something like how the shader editor has different subeditors for world, material, and line style.
    I imagine one for the viewport, one for render (the current one), and one for the video sequencer. This way we neatly get around the dependency problem.

  21. You don’t know you need such a thing until you see how it works. I believe this project will open up a lot more opportunities for Blender and bring even more users.

    This new feature is very powerful for speeding up ideation and initial post-processing, letting us see the final result even sooner. I believe it’s somewhat similar to V-Ray’s post-processing capabilities, which I was a bit envious of. If Blender improves its compositing with real-time and speeds up the final compositor, it’s going to be a huge jump in how useful this combination can be.

    Especially taking into account that it doesn’t depend on a specific renderer. So many opportunities open up with this!

  22. Please consider converting VSE strips and sequences into data blocks, so they can be manipulated this way.

    • Are you talking about the possibility of applying the real-time compositor on a VSE strip? While this is outside of the scope of the project, this is actually a possibility we took into account while developing the real-time compositor. See the “Reusable Compositor Node Tree” section in the UX document in the devtalk thread:
      https://devtalk.blender.org/t/real-time-compositor-feedback-and-discussion/

      • I think this is a good idea. This would allow replacing the current effects and strip modifiers in the VSE with a solution that is GPU-accelerated and shares code with the rest of Blender. Perhaps each VSE strip could simply become a separate scene with a “Movie Clip” input node by default?

  23. Please, please, implement Cryptomatte’s ID material, objects and masks.

    • It’s planned! If you look at the task, Cryptomatte is there.

  24. This is gonna be such a great improvement for workflow.

    Once this lands, and the EEVEE rewrite arrives too (really looking forward to that SSGI and POM), we’ll be able to do so many things in real time, right in the viewport. Gonna be awesome.

    • Can the new EEVEE be implemented in 3.3, please?

    • Now if only we could get IES/light projection in EEVEE, then it’ll be the ultimate haha. Those are the last things on my wishlist haha

  25. Great work, thank you for your efforts.

  26. Why 10 different nodes to merge the images instead of just one uber merge node with multiple blending modes??

    That also allows testing different blending modes without creating and reconnecting a new node each time.

    PS:
    horizontal compositing… errr

  27. Viewport compositor + Cycles + Light Groups… OMG! I want it now! :D

  28. This is amazing work, this is gonna multiply realism so much while using so much less effort

    Thank you!

  29. Will it work with Cycles and the other render engines available for Blender?

    • Yes. Currently, the viewport compositor operates on whatever is drawn on the viewport. So any render engine that supports viewport rendering is supported, including Cycles.

      However, render passes are not yet supported and will likely require special handling by engines.

  30. gee whiz, it’s Christmas

  31. Amazing work, Omar! I’m really excited to make use of this workflow for multiple render layers when that feature is supported.

    On a side note: why was it decided to do the viewport compositor before improving the performance of the regular compositor? Is optimizing the regular compositor harder than implementing the viewport compositor?

    Thanks!
    -Tim

    • The viewport compositor was part of the 2.90 roadmap for the EEVEE & Viewport module, and it was initially limited in scope to the requirements of viewport compositing. However, as the project started to take shape, it was realized that the implementation could be generalized to benefit and accelerate the regular compositor as well. So the scope of the project was later broadened to include the regular compositor in the long run. In other words, it was less a decision to prioritize one over the other and more a natural progression of the project.

In order to prevent spam, comments are closed 15 days after the post is published.
Feel free to continue the conversation on the forums.
