Future viewport, the design

As outlined in the previous post, there are some technical and feature targets we want to achieve. Recapping here:

1) Performance boost for drawing code. Always use the best method to pass data to the GPU, and support features that are only available in newer OpenGL versions and will enable better performance and cleaner code.

2) Node-based material definition for the viewport, and definition of a new real-time material system used for rendering (GLSL renderer).

3) Compositing. This includes effects such as outlines, depth of field, ambient occlusion, HDR, bloom, and flares.

4) Support mobile devices (OpenGL ES).

What is the state so far?

* Limited compositing (in the viewport_experiments branch). When we say limited, we mean that the compositor is not tied into the interface properly; it just applies effects to the whole contents of the framebuffer. Ideally, we would not allow UI indicators, such as wires or bones, to affect compositing. This is not too hard to enforce, though, and can be done similarly to how the current transparency/X-ray system works: by tagging wire objects and rendering them on top of the compositing result.

* Some parts of our mesh drawing code use Vertex Buffer Objects optimally, others use them but still suffer from performance issues because they do not use them correctly, and others do not use them at all.
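
To make the difference concrete, here is a minimal sketch of the VBO drawing path described above: vertex data is uploaded to GPU memory once and drawn from there, instead of being re-submitted every frame. This is illustrative only (it is not Blender's actual mesh drawing code) and assumes a valid OpenGL 2.1 context with GLEW for function loading.

```c
#include <GL/glew.h>

/* Upload vertex positions to a Vertex Buffer Object once. */
static GLuint upload_mesh_positions(const float *positions, int vert_count)
{
	GLuint vbo;
	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	/* GL_STATIC_DRAW: uploaded once, drawn many times. */
	glBufferData(GL_ARRAY_BUFFER, vert_count * 3 * sizeof(float), positions, GL_STATIC_DRAW);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	return vbo;
}

/* Draw directly from GPU memory; no per-frame vertex submission. */
static void draw_mesh_positions(GLuint vbo, int vert_count)
{
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, NULL); /* attribute data comes from the bound VBO */
	glDrawArrays(GL_TRIANGLES, 0, vert_count);
	glDisableClientState(GL_VERTEX_ARRAY);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```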

How will the soc_2014_viewport_fx branch help achieve the targets?

The soc_2014_viewport_fx branch provides a layer that can be used to migrate to newer or mobile versions of OpenGL with less hassle, but it also tries to enforce some good rendering practices along the way, such as the requirement in modern versions of OpenGL that everything is rendered through Vertex Buffer Objects. It also removes GLU from the dependencies (since GLU uses deprecated OpenGL functionality).

It also puts in place some initial functionality so that things can be drawn using shaders exclusively. This is essential if we move to modern or mobile OpenGL versions at some point.
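
Drawing through shaders boils down to compiling and linking a GLSL program and binding it before issuing draw calls. The helper below is a minimal, hypothetical sketch of that step; it assumes a valid OpenGL 2.1 context and omits proper error log retrieval.

```c
#include <GL/glew.h>
#include <stdio.h>

/* Compile a vertex and a fragment shader and link them into one program. */
static GLuint build_program(const char *vert_src, const char *frag_src)
{
	GLuint vs = glCreateShader(GL_VERTEX_SHADER);
	GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
	GLuint prog = glCreateProgram();
	GLint ok;

	glShaderSource(vs, 1, &vert_src, NULL);
	glCompileShader(vs);
	glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
	if (!ok) fprintf(stderr, "vertex shader failed to compile\n");

	glShaderSource(fs, 1, &frag_src, NULL);
	glCompileShader(fs);
	glGetShaderiv(fs, GL_COMPILE_STATUS, &ok);
	if (!ok) fprintf(stderr, "fragment shader failed to compile\n");

	glAttachShader(prog, vs);
	glAttachShader(prog, fs);
	glLinkProgram(prog);

	/* The program keeps the compiled shaders; the handles can be released. */
	glDeleteShader(vs);
	glDeleteShader(fs);

	/* Everything drawn after glUseProgram(prog) goes through these shaders. */
	return prog;
}
```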

So it mostly helps with targets 1 and 4, but more work will need to be done after merging to realize those targets fully.

At some point, if we want to support modern or mobile OpenGL, we cannot avoid rewriting a big part of our real-time rendering code. The branch already takes care of some of that, so it should be merged and worked on (merging is really the first step), unless we decide we do not care about supporting those platforms and features.

My estimate, based on personal experiments with manual merging, is that it would take about 2-3 weeks of full-time work to bring the branch to master-readiness.

Can we focus on some targets immediately?

Yes, we can. Some targets, such as node materials or compositing, just assume GLSL support in mesh drawing, which the branch does not yet fully provide, so it is not really blocking their progress. However, getting the branch in as soon as possible will mean fewer headaches during the merge.

Viewport usability design

Draw modes

Draw modes have become somewhat unpredictable as to what they enable, and they are tied to a real-time material definition limited to specular/diffuse/textured. They are also bound to the texture-face data structure, which is becoming less relevant since we are slowly moving to a material-based approach. Artists often have to tweak a number of material and object options to get the visual feedback they need, which can be frustrating and is not apparent to new users either. We need a design that allows artists to work easily in a particular workflow while being able to visualize what they want, without extensive guesswork about how best to do so. Ideally we want to drop draw modes in favour of…

Workflow modes (model, sculpt, paint, animation, game shader design)

Different workflows require different data and different visualizations, so we can define ‘workflow modes’, each of which includes a set of shaders and visualization options authored specifically for that workflow. For instance, a ‘workbench’ mode in edit mode would have a basic diffuse and specular shader with wireframe display options. For retopology, it would make sense to use a more minimal, transparent mesh display, like hidden wire, with depth offsetting to avoid intersection artifacts (a sketch of what such a mode could bundle together follows below).

Example image of edit mode display options. Some options exist to aid in specific workflows, but this is not so readily apparent.

For material definition or texture painting, users might want the full final result or an unshaded version of it for detail tweaking.
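
To make the ‘workflow mode’ idea a bit more tangible, here is a hypothetical sketch in C of what such a mode could bundle: a set of shaders plus the display options relevant to that workflow. All names here are invented for illustration and do not exist in Blender's code base.

```c
/* Hypothetical description of a workflow mode: invented names, not real Blender code. */
typedef struct WorkflowShader {
	const char *vertex_src;    /* GLSL vertex shader used by this workflow */
	const char *fragment_src;  /* GLSL fragment shader used by this workflow */
} WorkflowShader;

typedef struct WorkflowMode {
	const char *name;       /* e.g. "workbench", "retopology", "texture paint" */
	WorkflowShader shader;
	int show_wireframe;     /* draw a wire overlay on top of the shaded mesh */
	int hidden_wire;        /* minimal transparent mesh display for retopology */
	float depth_offset;     /* polygon offset to avoid intersection artifacts */
} WorkflowMode;

/* Example preset matching the retopology case described above. */
static const WorkflowMode retopo_mode = {
	"retopology",
	{ "/* pass-through vertex shader */", "/* flat, transparent fragment shader */" },
	1,      /* show_wireframe */
	1,      /* hidden_wire */
	0.001f, /* depth_offset */
};
```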

Debugging (logic, rigging, etc.)

Drawing can offer visual feedback that makes it easier for users to examine problematic areas in their scenes. Examples include the order of dependency calculation, color-encoded vertex and face counts, or even debug options available to developers.
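
As an illustration of the color-encoded counts mentioned above, the helper below maps a per-object face count onto a simple green-to-red ramp so that dense objects stand out. It is purely a sketch; the normalization and color choice are arbitrary.

```c
/* Map a face count onto a green (sparse) to red (dense) debug color.
 * Purely illustrative; normalizing against max_faces is an arbitrary choice. */
static void face_count_debug_color(int face_count, int max_faces, float out_rgb[3])
{
	float t = (max_faces > 0) ? (float)face_count / (float)max_faces : 0.0f;
	if (t > 1.0f) t = 1.0f;
	out_rgb[0] = t;          /* red rises with density */
	out_rgb[1] = 1.0f - t;   /* green falls with density */
	out_rgb[2] = 0.0f;
}
```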


Easy to switch from one to another, easy to configure or script

Using the workflow system, users should be able to make their display more predictable. Each workflow mode can expose settings for the shaders or passes used, but we can allow more customization than this. A node interface will let users request data from Blender and write their own shaders to process and visualize that data in their own way. We will follow the OSL paradigm, with a dedicated node that requests data from Blender in the form of data attribute inputs connected to the node. The data request system is at the heart of the new data streaming design, which means that materials and custom shaders should be able to request such data. Access to real-time compositing will probably be included too, though memory consumption is a concern there, and we need to define more precisely how data will be requested in that case.
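
Since the data request system is still being designed, the following is only a hypothetical sketch of the idea: a user-written shader node declares which mesh attributes it needs from Blender, and the drawing code streams only those. All type and field names are invented for illustration.

```c
/* Hypothetical data-request types; none of these names exist in Blender. */
typedef enum DataRequestType {
	REQ_POSITION,
	REQ_NORMAL,
	REQ_UV,
	REQ_VERTEX_COLOR,
	REQ_WEIGHT
} DataRequestType;

typedef struct DataRequest {
	DataRequestType type;
	const char *layer_name;  /* e.g. which UV map or vertex group to stream */
} DataRequest;

/* A custom shader node in the OSL-like paradigm: the shader body plus the
 * attribute inputs it requested from Blender. */
typedef struct CustomShaderNode {
	const char *glsl_source;
	const DataRequest *requests;
	int num_requests;
} CustomShaderNode;
```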


Modernize! Assume that users will always want the best, most realistic, etc.

With the capabilities modern real-time shading offers, we aim to add a third render engine using OpenGL (next to Blender Internal and Cycles) that leverages the capabilities of modern GPUs and is tailored to make real-time rendering a real alternative for final rendering in Blender. A lot of the components are already there, but we can push further, with shader implementations optimized especially for real-time rendering instead of trying to mimic an offline renderer.

We want to make sure that our material display is pleasing, so we are exploring more modern rendering methods such as physically based shading (a patch by Clement Foucault using notes from Unreal Engine 4 is already being considered for inclusion) and deferred rendering.
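
To give a flavour of the math involved, here is a generic, textbook-style sketch of the GGX normal distribution term popularized by the Unreal Engine 4 shading notes. It is not the code from the patch mentioned above, just an illustration of the kind of shading function a physically based pipeline evaluates.

```c
/* GGX / Trowbridge-Reitz normal distribution term, using the UE4 convention
 * alpha = roughness^2. n_dot_h is the cosine between the normal and the half vector. */
static float ggx_distribution(float n_dot_h, float roughness)
{
	const float pi = 3.14159265358979f;
	float a = roughness * roughness;
	float a2 = a * a;
	float d = n_dot_h * n_dot_h * (a2 - 1.0f) + 1.0f;
	return a2 / (pi * d * d);
}
```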

Needless to say, this will also mean improved material previews for Blender Internal and Cycles.

43 comments
  1. Hi there,
    Any chance of PBR shading with shadows and local reflections being released soon?
    The idea of a few seconds per frame vs 2.5 hrs is really a game-changer.
    Blender is really going the right way, and I hope to know when we can see it.

    Cheers

  2. Will this run the game engine?

    Can the faster draw calls be exploited there yet?

  3. Hi there!
    Please have a look at this presentation from NVIDIA engineers about optimizations of the OpenGL rendering pipeline. I hope it can be useful for Blender developers:
    http://www.slideshare.net/CassEveritt/approaching-zero-driver-overhead
    Thanks

  4. Thank you so much. I think Blender is going to get a huge performance boost with this new viewport.

  5. Hi, is there any chance to have viewport displacement, like in Maya Viewport 2.0, and inextensible hair strands rendered on the GPU in real time in the viewport, like TressFX or HairWorks, if OpenGL 4 is supported in the near future?
    Regards.

  6. A physically based real-time renderer would be fantastic!

    P.S. Happy NY

  7. No more need for matcaps in the sculpting room at all; a better way of visualising what the scene actually looks like without having to wait ages for the Cycles viewport to render the scene and clear enough noise to get a basic understanding of what it looks like, or having to wait for a full render to get a basic understanding of what the whole scene would look like. This real-time renderer would give us real-time results. What would also be cool is if Blender used a unified material system shared by Cycles and the new real-time viewport renderer, so that you build the materials once and they work in Cycles or the new viewport effortlessly, without extra work.

  8. Yes! Real-time rendering! I’d love to see the fruits of the candy branch (mainly area lights and SSS) finally in the trunk. I do love the real-time SSAO and DOF (only NVIDIA works) in the Gooseberry branch, but this PBR & the same shaders running well in the GE would really help a lot of us out in the motion graphics business. I could imagine combining them with GLSL 2D shader effects would yield a very powerful combination for a lot of 3D work.

  9. I’m really looking forward to this feature being included in a major release. The use cases for real-time rendering are so numerous, especially for motion graphics work. Hence the huge popularity of software like Element 3d (as others have mentioned) which I’m sure many of us also use. It will be awesome to be able to do these kind of renders right inside Blender now instead of having to export OBJ sequences for external use. I love this software!

  10. The viewport project looks amazing :) I’ve tested the latest gooseberry build, and am I the only one to get a viewport freeze when moving the DOF widget/locator in the viewport? I can only close Blender. It’s ok if I change the focus distance in the camera settings though.

  11. Hello, first many thanks for your amazing work! I am coming from a 3ds Max background and what I am really missing is colored wireframes in the working viewport. Is this on your to-do list as well? Many thanks, Filip

    • Not familiar with 3ds Max, so this could be some specialty feature I’m not understanding. But I believe you would be able to change how wireframes are colored by playing with your user preferences; I’ve done things like this before. Check around for tutorials about user preferences for more detail.

  12. That reminds me of TrueSpace 7.6.
    Maybe Roman Ormandy was somewhat ahead with his ideas; only the software was not that stable.

  13. I’d like to see updates in the Blender BGE.
    Something in bgl. BGL is perfect for creating 2D filters, but it still misses resources like bgl_NormalTexture or some way to do this.
    But this current update is too nice! Thanks and good luck ^^

  14. This is fantastic news, thanks for getting this implemented into Blender! Have you worked out an implementation of how shaders for specific workflows can be set up yet? I haven’t really used node-based shading, so sorry if this is a stupid question, but will individual shaders or node setups be exportable, so you can store and use them when you need to?

    Also, will specific shading models like a UE4-based PBR shader for example, be usable across different workflows?

  15. Hi Psy-Fi,

    While in essence (from a code perspective) it is a new render engine, would it help artists better understand its purpose, prevent any misunderstanding and questions about which render engine they should use, if instead you call it a new ‘display engine’?

    Thank you for your hard work and time,

    David

  16. Amazing. I hope more and more to use this type of technology to be able to render fast and beautiful results, because unfortunately not everyone has a render farm at home… and you do not always need to be photorealistic :D
    Is there some hope to get something like Element 3D real-time rendering, or better, in the future? (https://www.youtube.com/watch?v=P3BKw9jK-kI)
    At the moment I’m playing with Unreal Engine 4 for animations xD

  17. This will be so cool. Although a separate realtime render engine where you can modify all stuff is a very good addition, I look forward to a realtime mockup preview of Cycles or BI the most. I know the Rendered viewport option is there, but sometimes that can be slow. I hope it works like this:
    Let’s say you put a Sun with shadows in Cycles, the shadows will be pathtraced in the final render but the OpenGL preview uses cascade shadow mapping or something like that. Or let’s say you use raytraced ambient occlusion in BI for the final render but the OpenGL preview uses SSAO, and when you switch it off or change the distance in the World settings, it changes for both the render and the preview. And so on for reflections, energy conservation, any material node setup or math, etc. Doing this for Cycles will give you the realtime PBR you want.

    My main concern is the fragmentation of so many different viewport shading modes and options inside those shading modes, as well as per render engine. Ideally, Blender would come with a default set of viewport shading modes, but users would be able to create or modify custom shading modes for each engine, suited to their needs with any combination of stuff they want: wireframe, post-processing, vertex shading, shadows, matcaps, whatever. And they may do that through nodes and/or user preferences. So later they can choose their custom shading from the same dropdown menu and the same key bindings currently used for viewport shading modes.

  18. Does this mean we will see significant changes to the game engine too? The way I have understood it so far, the lack of OpenGL ES support is what has been hindering Blender (and the BGE) from working with Android/iOS.

  19. Any chances of procedural textures in realtime?

    • There is a patch in the tracker. One issue with such textures is that they take a long time to compute in the pixel shader, but that would be up to the user of course. We’ll probably have to warn about that in the documentation.

      Also, some older cards have a limit on the number of instructions in the pixel shader. We should really target shader model 3.0 hardware here but there’s no easy way to detect this in OpenGL.

  20. That sounds amazing, really exciting times ahead. I was dying to have an open-source real-time render engine, and now you plan to include one directly in Blender! We Blender users are so blessed!

    About the material workflow, it would be awesome to have a unified workflow for material creation, i.e. kind of the same nodes for both Cycles and this new render engine. It would be such a killer feature!

    Thanks for your hard work and good luck for this project. :)

  21. One of the biggest priorities for me would be better visualization of cycles materials, especially displacement and specularity. The realtime renderer also sounds useful for some applications.

  22. Is there a way that this can be drawn quickly? Currently renders are quite slow on my machine (moving the camera and making changes at 0.1-1 FPS would be quite time consuming), or will this be made to accommodate only very high-end workstations (2+ high-end GPUs, etc.)?

    • Again, we are targeting OpenGL 2.1 hardware level for now, with possible upgrade to newer versions in the future. By slow renders do you mean the viewport itself or OpenGL rendering? Which mode is it in? The current viewport performance varies greatly depending on the selected mode.

      With the planned changes I expect the viewport for material mode will become much more responsive.

      Of course the better the GPU, the more polygons and effects the viewport will juggle.

  23. A very exciting new direction for Blender! This will really complement Cycles nicely at the other end of the spectrum. I can see myself using this for a lot of motion graphics work in the future, where photorealism is often not required, but speed and flexibility are.

    A nice PBR glossy/metallic workflow, combined with some of the latest real-time tech seen in modern engines (such as UE4 and Element 3D) like screen-space reflections and voxel-based GI, will hopefully be an option in the future. In tandem with the compositing nodes, this will create a unique workflow that no other existing DCC apps have yet :)

  24. Great news! Please look at Element 3D v2 for inspiration.

  25. A physically based real-time renderer would be fantastic! The current viewport renderer is one of the major things holding back Blender. Are there also plans for temporal anti-aliasing and screen space reflections like in UE4?

  26. This is all technical stuff, which is way over my head, but on a practical level, as someone who uses Blender for about 70 hours a week, there are smaller things I would like to see. I have a render folder assigned on my hard drive, for renders! I would love to have a folder assigned for material presets (Internal and Cycles nodes) that can be accessed via the viewport, independent of the data blocks associated with the current blend file; Blender must be the only software I have used that doesn’t have this feature? I would love to have subdirectories for vertex groups and shape keys. It is no fun at all scrolling through a hundred shape keys to find what you’re looking for. If I could put ‘Facial’ shape keys in one subfolder and ‘Body’ shape keys in another, I would find the workflow a lot better, and vertex groups in the same way; this is purely to isolate them on a visual level. And I would love to have a fully drag-and-drop UI, maybe even a UI editing mode; there are so many things I don’t need cluttering my workspace, but I don’t have the option to remove them… I love the work you guys do, but these are things that affect me on a daily basis, rather than what you do under the hood.

    • Dean: that’s the asset system we’re working on as well.
      Design proposal will be presented for review within weeks.

      • Hey Ton, sounds like you’re working on the best Christmas present I could have asked for. I’ll get my Blender Cloud subscription restarted. Hope you have an awesome Christmas & New Year.

    • I would second this request for folders for vertex groups and shape keys. It’s currently not even very ergonomic to organise shape keys in order; you have to click them up and down the list, etc.

  27. This is very exciting, and it sounds like the workflow modes take a more user-focused approach. I can’t say enough that there is great potential in the texture and material tools in Blender, but I really think that if we could direct a composite node output to a texture while it is being worked on, that would greatly improve the ability to test it on the model. I am wondering if the part above about the framebuffer has anything to do with that…

  28. You’re talking here about the edit screens getting an almost Cycles-like view, something like the Unreal engine, as you mentioned.

    Besides those “work screens”, will there also be updates to the main Cycles engine for when people do the final render phase?

    Also, for people who don’t have special GPUs on board, will Blender still be able to run on such machines (and if yes, will it be slower or faster)?

    • Cycles final rendering will be unaffected; this is for real-time preview and for the new real-time engine. For system requirements we are aiming at OpenGL 2.1 compatible hardware, with plans to move to 3.2+ at some point in a few years. Performance should be better in material mode for current renderers, but of course the more effects one enables in the viewport, the heavier it will be on the graphics card, so a better card will always help here.

      • Will it be game over for people who don’t have a GPU on board?

        • For people without a GPU supporting OpenGL 2.1 it will be a problem, but it will be a few releases from now until this takes effect.
          I highly doubt any machine without a GPU is able to handle Blender even now. If Blender works on your machine there’s a very high chance that you have a GPU, maybe a cheap on-board or Intel one. And all GPUs sold in the last 7 years should have OpenGL 2.1 support (OpenGL 2.1 came out in 2006 – almost 9 years ago).

          • There’s hardware from 2003 with support for OpenGL 2.1; almost all hardware from 2004-2005 and newer should work with OpenGL 2.1, just by using drivers from ~2006+.

            Anyway, why use ~2004 hardware with Blender? It’s like using a 386 to crack passwords.

          • When can we hope to see the improvements?

            What about in the game engine?

            Modern GLSL?
