Blender 2.8 Viewport Development

Introduction

Blender 2.8's original design brought a workflow-based usability mantra to the conversation. Within those constraints we went over the ideal pipeline for a few well-defined workflows.

Here is an excerpt from a recent viewport development design document which is still valid for this proposal and summarizes the principles presented here:

“There’s a fundamental difference between an object’s material appearance and the way that object is displayed in the 3D view. Materials are part of your data, just as geometry is. One way to display is full material preview. When you want a high quality rendering of a lit scene, or you are working on the materials themselves, this is the best choice. There are other ways to view your data, useful in different situations or stages in your workflow.

Answer the question: what about your object do you need to see right now? Maybe the polygonal structure of your geometry is important — especially true while modeling! In this case you want to see the faces and edges clearly. Lighting and materials on the surface are less important and more likely a distraction. Sometimes the shape or form of an object is most important — sculpting is the best example. Individual polygon faces are less important, edges and materials are distracting. Or draw in solid black to discern your object or character’s silhouette” (merwin, June 2016)

To align the discussions between the artists and the technology we are building for Blender 2.8, we introduced a few new concepts: tasks (individual components of a workflow), plates (rendering techniques and interactions within a viewport) and minimum viable products (a few main early deliverables of the development cycle). Each of these concepts is explained in its own section.

Out of those discussions with a small group of artists and developers, we came up with the present design for the viewport. The ideas presented here will evolve during the project as they are confronted with real-world implementation challenges.

[Image: pbr]

Three pillars of this project

Performance and Responsiveness

  • More responsive navigation (fast drawing)
  • More responsive to user actions (low latency editing, updates, selection)
  • Handle more complex objects / scenes with ease
  • Work on a wide range of hardware
  • Be power efficient (update screen only when needed)

Task appropriate shaders

  • Higher quality visuals (real-time PBR, rich material preview, solid outlines, shadows)
  • Different shading modes for different objects
  • Improve usability (visual cues, showing selection, task-centric drawing techniques)

Integration with external engines

The new viewport will allow for more accurate preview of the external engine your project is targeting. Each engine may have different material and shading models, as well as unique screen effects. Whether you’re making assets for a movie or for a game, the idea is to see what it’s going to look like while you work on it.

  • Offline renderers: Cycles (i), Luxrender (ii), Renderman (ii)
  • Real-time renderers: Blender Internal (iii), p3d (ii), Sketchfab (ii)
  • Game engines: Unreal (ii), Unity (ii), CryEngine (ii), Blend4Web (ii), Armory (ii), Godot (ii)

(i) For Cycles support we will use the PBR implementation docs by Clément Foucault as a reference [link].

(ii) Proper export of assets (with materials) to your engine of choice is not strictly part of the viewport project, and neither is specific support for its shaders. The project will support Python addons so other developers can work on those integrations.

(iii) We intend to retire Blender Internal as a standalone offline renderer, and make the real-time viewport take over its duties. We plan to support its material system and convert existing Blender files relying on the old engine to their OpenGL equivalent shaders seamlessly whenever possible.
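To make footnote (ii) concrete, here is a minimal sketch of how such a Python add-on integration could look. All names here (`register_engine`, `preview_material`, the "unreal" converter) are hypothetical; the actual viewport Python API has not been designed yet.

```python
# Hypothetical sketch only: illustrates add-ons registering per-engine
# material converters so the viewport can preview engine-specific shading.

_engine_converters = {}

def register_engine(name, material_to_shader_params):
    """An add-on registers a callback that translates its engine's
    material description into a GLSL-friendly parameter dict."""
    _engine_converters[name] = material_to_shader_params

def preview_material(engine, material):
    """The viewport asks the registered converter for shader parameters."""
    convert = _engine_converters.get(engine)
    if convert is None:
        raise KeyError(f"no viewport integration registered for {engine!r}")
    return convert(material)

# Example: a minimal "unreal"-style add-on mapping base color + roughness.
register_engine("unreal", lambda mat: {
    "base_color": mat.get("BaseColor", (0.8, 0.8, 0.8)),
    "roughness": mat.get("Roughness", 0.5),
})

params = preview_material("unreal", {"BaseColor": (1.0, 0.0, 0.0)})
```

The point of the sketch is only that the viewport core stays engine-agnostic: everything engine-specific lives behind a registered callback supplied by an add-on.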

Tasks

A task is an individual step that is part of a larger workflow. For example, the texture painting workflow can be split into a few atomic tasks, such as painting a texture channel, checking the result combined with the other maps, and previewing a final render.

Each task also has a mode associated with it (e.g., sculpting, painting, editing, …). However, the same mode (e.g., mesh edit) may be required for different workflows and tasks (e.g., retopology and asset modelling).

Even though we want to support customization, we will build a pipeline based on artists' feedback, with well-defined presets for each task. The list of tasks to be supported can be fleshed out during the upcoming usability summit.

That being said, a few workflows were chosen for the initial deliverables, and are listed here as the initial minimum viable products.

Switching between tasks should be as simple and fast as picking one from a pie menu.
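As a rough illustration of the idea, a task preset can be modeled as a named bundle of a mode plus the plates it draws. This is a sketch under assumed names (`TASK_PRESETS`, `activate_task`); the real preset format is still to be defined.

```python
# Hypothetical sketch of task presets: each task bundles a mode with the
# plates its viewport should draw. All names are illustrative only.

TASK_PRESETS = {
    "asset modeling": {
        "mode": "mesh edit",
        "plates": ["studio light", "solid wireframe overlays", "outline highlight"],
    },
    "retopology": {
        "mode": "mesh edit",  # same mode, different workflow and plates
        "plates": ["depth wires", "reference surface"],
    },
    "layout": {
        "mode": "object",
        "plates": ["solid color with scene lights", "depth of field", "reflection"],
    },
}

def activate_task(name):
    """Switching tasks (e.g. from a pie menu) just swaps the active preset."""
    preset = TASK_PRESETS[name]
    return preset["mode"], list(preset["plates"])

mode, plates = activate_task("retopology")
```

Note how "asset modeling" and "retopology" share the mesh edit mode but draw different plates, which is exactly the task/mode distinction described above.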

Plates

Different tasks require different visuals. For layout, for example, you want to see the scene lighting, depth of field, shadows and reflections in real time. For asset modeling you may need a basic studio light setup, polygon edges, and special edges such as sharp edges or UV mapping seams.

Every one of those individual components (depth of field, studio light, front wires, …) is called a plate. They are composed together in a stack or processed in parallel, depending on the plate. If you want the edit object to show with a fully lit (PBR) shader, that's a plate. If on top of this we have a highlight outline for the edit object, that's another plate. If we have all the other objects in a simplified representation (no subsurf) with a clay shade, that's another plate. If we have depth of field in the viewport, that's yet another plate.

A key point is that plates are drawn independently of each other. Manipulator widgets or selection outlines don't need to know whether they're being drawn atop a simple solid shaded scene versus a full Cycles rendered scene. Separating all our different drawing techniques into independent plates is going to allow mixing & matching & combining of techniques to assist the task at hand. On the developer side, this gives us markedly cleaner and easier-to-maintain drawing code. New visuals can be implemented with little worry about interfering with how other plates are drawn.
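That composition model can be sketched as follows. This is an illustrative Python mock-up, not the actual drawing code: each plate only knows how to draw itself, and the viewport stacks the results in order.

```python
# Hypothetical sketch of plate composition. Each plate renders on its own;
# the viewport composites their outputs in stack order, so no plate needs
# to know what is drawn beneath it.

class Plate:
    def draw(self, scene):
        raise NotImplementedError

class ClayShading(Plate):
    def draw(self, scene):
        return f"clay({scene})"

class OutlineHighlight(Plate):
    def draw(self, scene):
        return "outline(edit-object)"

class DepthOfField(Plate):
    def draw(self, scene):
        return "dof"

def composite(stack, scene):
    """Draw every plate independently, then blend outputs in stack order."""
    return [plate.draw(scene) for plate in stack]

layers = composite([ClayShading(), OutlineHighlight(), DepthOfField()], "scene")
```

Swapping the clay plate for a full PBR plate would leave the outline and depth-of-field plates untouched, which is the maintainability win described above.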

An extensive list of plates can be found in Appendix I.

Minimum viable products

The viewport project is structured to gather continuous feedback from artists during its development. To balance work on the code infrastructure with showing useful results for artists to jump in right away, we defined one milestone per project pillar.

Each milestone represents the initial code to consolidate a pillar’s full implementation, as well as a fully functional artistic working experience. Those specific milestones will be treated as MVPs—minimum viable products—and should illustrate the benefits of the viewport project for a certain workflow.

The MVPs were chosen based on their gradual implementation complexity, their clear correlation with the proposed pillars, as well as their relevance in an animation studio pipeline.

  1. Character modeling
  2. Camera and staging
  3. Texture painting

1. Character modeling

“Performance and responsiveness”

Deliverables:

  • Huge performance gain
  • Beautiful editing mode
  • Groundwork to illustrate why wireframes are a poor choice for some (if not all) tasks

Plates:

[Image: mvp_1]

  • Optional plates like Grease Pencil are shown here with a dotted outline.
  • Tool UI (snap, selected v/e/f, …), proportional editing
  • Object being edited will be drawn in a way that stands out visually and informs the modeling task. Shown here in basic clay for early deliverables.
  • Objects not being edited are shown for spatial context but drawn in a way that diminishes focus on them.
  • Optional reference image (blueprint, face profile, etc.)

2. Camera and staging

“Task appropriate shaders”

Deliverables:

  • Reliable and controllable lights
  • Advanced screen effects (DoF)
  • Flat reflection
  • High quality playblast previews

Plates:

[Image: mvp_2]

  • Light elements (widgets to adjust scene lights)
  • Probe elements (to setup cubemap reflections)
  • In this example objects are drawn solid (per-material flat color) with scene lights.

3. Texture painting

“Integration with external engines”

Deliverables:

  • Fantastic, good-looking shaders
  • Smooth pipeline (no need for switching between Cycles/BI to get the right “look” or options in the viewport)

Plates:

[Image: mvp_3]

  • Tool UI includes painting cursor, brush preview, etc. If preview lighting is used, an extra plate with grabbable light controls will also be shown.
  • Edit Object plate will be one of the variations shown. You’ll be able to flip between them as needed.

Pending designs

We plan to revisit the overall design every time we reach one of the MVPs. For instance, there is no mention yet of how task customization will work, nor of the user interfaces.

We also need to explore what the other workflows may be, what tasks they consist of, and what technology and techniques we need in the viewport to fully support them.

The idea is to validate this proposal with the deliverables, and adapt it accordingly.

Meet the team

The development will be open to the usual Blender contributors, artists and developers. We will have an open call for developers to assist with specific initiatives here and there, and will welcome anyone willing to work together towards the 2.8 viewport goals.

The current team, their IRC nicknames, and their roles in the process are:

Mike Erwin (merwin) is responsible for the low level implementation. This has started already with the OpenGL upgrade, Vulkan preparations, and internal API design.

Dalai Felinto (dfelinto) will help coordinate the work from the developers and artists, and take over part of the core development.

Clément Foucault (hypersomniac) will take over much of the PBR implementations. Part of his PBR branch will be ported over as a Cycles alternative for realtime previews.

Sergey Sharybin (hackerman) will handle all the scene data that will be fed into the pipeline. This will happen in parallel to the new Blender depsgraph.

Blender Studio artists will provide feedback and help validate the design.

Appendix I – Plates

Following are some of the individual plates we anticipate being required. Additional plates can be implemented once the workflows and tasks are better defined in the future.

Effects

  • Depth of field
  • Reflection
  • Color adjustment
  • Fluid simulation
  • Particles
  • Lens flare
  • Custom (GLSL)

Elements

(aka Objects not renderable)

  • Cameras
  • Lamps
  • Probes
  • Speakers
  • Forces
  • Motion path

Shading

  • Clay (solid + AO)
  • Outline highlight
  • Solid wireframe overlays
  • Depth wires
  • Solid color with scene lights
  • Grayscale with scene lights
  • Real-time (GLSL/PBR) with scene lights
  • Real-time (GLSL/PBR) with HDRI
  • Matcaps
  • Raytracer lit (e.g., Cycles)
  • Studio light
  • Silhouette (uniform color / solid black)

Filter

  • All objects
  • Edit object
  • Object types (Mesh, Armatures, Curves, Surfaces, Text …)
  • Mouse over object (?)
  • Mouse over mesh hot spot object (? – mesh manipulator)
  • Outline search result (?)

Misc

  • Simplified geometry
  • No hair

Appendix II – Buffers

Each plate reads specific data from the scene graph and from other plates’ buffers, and writes into its own buffers. The buffers for a typical 3D View will consist of:

  • Scene depth (32-bit)
  • UI depth for 3D widgets
  • HDR color (16f RGBA by default, 32f option in system prefs)
  • LDR color (8-bit RGBA)
  • Object ID for selecting/masking (32-bit integer)
  • Composited screen color (10-bit RGB)
  • Previous HDR color for SS reflections

These details are mostly invisible to the user, and part of what the viewport API should be able to gather from the Blender depsgraph. To quell panic about runaway VRAM usage: each plate does not necessarily have its own private buffer, these buffers cover only the 3D View (not the whole screen), and only the composited screen color must be double buffered.
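For a rough sense of the memory involved, the per-pixel cost of the buffers above can be tallied. The bytes-per-pixel values follow the stated formats; the UI depth precision and the 10-bit RGB packing are assumptions, and the resolution is an arbitrary example.

```python
# Back-of-the-envelope VRAM estimate for the shared 3D View buffers listed
# above. Bytes-per-pixel follow the stated formats; the UI depth size and
# 10-bit RGB packing are assumptions, and real driver packing may differ.

BUFFERS = {
    "scene depth (32-bit)": 4,
    "UI depth for 3D widgets (assumed 32-bit)": 4,
    "HDR color (16f RGBA)": 8,        # 4 channels * 2 bytes
    "LDR color (8-bit RGBA)": 4,
    "object ID (32-bit integer)": 4,
    "composited color (10-bit RGB, assumed packed 32-bit)": 4,
    "previous HDR color (16f RGBA)": 8,
}

def vram_bytes(width, height):
    """Total bytes for one copy of each buffer at the given viewport size."""
    return width * height * sum(BUFFERS.values())

total = vram_bytes(1920, 1080)   # 36 bytes/pixel at full HD
mb = total / (1024 * 1024)       # roughly 71 MB
```

Even at full HD this lands in the tens of megabytes, which is why sharing buffers between plates (rather than one private set per plate) matters.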

  1. Thanks for such a well thought-out plan! I’m just curious how this plan relates to Hydra? Is there still a plan to integrate it in some way?

    Also, real time hair display is essential for animators to achieve accurate silhouettes in their performance. You note in the Misc that hair is not included in the plan. Is this just for first pass, or is that long term as well?

    Cheers,
    Jason

    • The folks working on Agent 327 said the same thing about hair.

      Real-time hair preview is not part of these MVPs, but will be there in the future. The “no hair” item meant an option for “don’t show hair right now” — sorry for the needless moment of panic!

    • How many hairs does a wolf have, Google? Now let's multiply that by 50.

      Hair is one of Blender's holy grails. It would be absolutely stellar if we could get some better feedback in the viewport. This is so important when it comes to doing client work.

      Without it life is much, much slower.

      Will the compositor be merged with the viewport as well? Perhaps similar to Nuke? Kinda makes sense when you think about the real-time rendering that is going on. Natron?

  2. This is fantastic development news, really excited about 2.8
    As an artist I love reading these blog posts, they shed some real insight into the workings of Blender.

    Keep up the excellent work

  3. That is awesome design roadmap.

  4. I am amazed! The 2.8 will be the beginning of wonder!

  5. While I’m impressed by the leaps and bounds in Blender’s real-time rendering, and am starting to make animations designed for real-time rendering, I’m a little sad to see the end of Blender Internal… while 90% of it can now be handled in OpenGL, there are a few things like reflections, refractions, and SSS that I can’t see being converted to OpenGL properly. Blender Internal proved itself as a quality render engine for the first 3 Blender Institute films and countless other projects, and is still much faster than Cycles.

  6. An open source miracle!

  7. Nothing about textures in Edit mode?!
    Seriously, a solution for UV mapping, and for textures projected by the UVProject modifier, is needed for the default Character Modeling task.

    OK, we don’t need a complete PBR shader to light the model.
    But often, modeling can be based on texture details.
    Knife for cut-out characters, cheap morphing effects with PET or sculpt mode, etc…

    And again, the camera and staging stage needs support for animated textures, dynamic paint effects…

    A shading plate that displays the selected texture, the one that is important for the animation.

    Don’t repeat the 2.5x error of forgetting textures in the design.

    • For the Modeling MVP we were thinking of initial shape modeling, pre-texture/material. Of course it’s common to go back to edit or tweak models at any stage of completion, or like you say model *based* on a texture. Including materials in the modeling workflow is important, it just isn’t the *first* thing we plan to get working.

  8. This seems remarkably well formulated, very constructive outline of what needs to be done. I wish the team every success, and a timely inclusion of these possibilities. I hope the timeline for implementation can be made briefer with more resources, rather than extended, as it will be crucial for more wide spread Blender adoption. Thank you all.

  9. Sounds awesome! Have you thought about exposing plates to users via the node editor?

  10. Will there be any focus on a workflow for people creating real time assets for games and such? That has always felt like an after thought to me in blender. Things like rendering and sorting alpha textures in the viewports and custom shader support would greatly improve real time asset creation.

    • Yes, definitely! I’m more interested in the game dev side of things than movie production. We’ll make sure the tools serve both.

  12. I think in part 2 “Camera and Staging”, you spelled boomsmash wrong. 😉

    More seriously, it looks awesome on the paper, can’t wait for the first tests, good luck to the team !

  13. How will this affect the BGE? Will the effects work in the BGE too, or do some not work in the viewport?

  14. When are we expecting blender 2.8 to be released?

  17. Interesting times ahead.
    This looks like a golden opportunity to take a deeper look into colour management, and to take care of some old alpha channel issues that couldn’t be addressed before because of the old code/OpenGL.
    Are there any specific plans regarding those areas?

    • Totally agree. Plans will get more specific as we get further into development. Those unclamped floating point color buffers aren’t in the design arbitrarily 🙂

  18. I’ll admit, I’m quite worried about the prospect of Blender removing wireframes. Will the ability to view wireframe mode and edit, or ‘see through’ an object with wireframe and select verts be possible? It’s my only major concern because I work with wireframe mode almost constantly with my workflow.

    • Unless Blender suddenly drops subdivision surface / poly modeling and moves to a completely different workflow, such as NURBS/curve-defined surfaces (surface modeling) like the programs Ton used as an example of a wireframeless interface during bCon last year, you’ve got very little to worry about.

      While those programs do work quite well for hard-surface man-made objects, they’re terrible for making more complex organic or deforming things.

      Also, I’m pretty sure Pixar would have moved on from sub-D modeling if there were a viable and better alternative. OpenSubdiv is Pixar’s thing, though. https://vimeo.com/70600180 http://graphics.pixar.com/opensubdiv/docs/mod_notes.html

      Mostly I think he was just talking about how dense meshes or scenes become a total mess if you view wireframes, so it doesn’t really help or look clean or even clearly represent the mesh to the viewer.

      In which case I think they just need to rethink how wireframes are displayed to make things cleaner. I’m sure that isn’t so easy (maybe not even possible) when subsurf isn’t used, but much easier when it is. After all, we’ve already got an “optimal display” checkbox in the subsurf modifier settings that turns the display of subsurf wires on and off. Adding more options there to display different subsurf levels less visibly, or not at all, would help quite a lot rather than displaying all wires the same way, I think.

    • Worry not, Blender is not tossing wireframes! Wrote more about it here:
      https://lists.blender.org/pipermail/bf-viewport/2016-September/000208.html

      I’ll do a show & tell at the upcoming Blender Conference.

  19. This is a really nice plan, and easy to understand for someone like myself. It looks like the texture painting workflow even allows for shadeless work on images, and for capture of them in the 3D viewport as we already do in Blender Render through the camera if they are single channel. Is that possible also for materials with multiple channels? I use the image slots as layers for painting in both 2D and 3D sessions in the 3D view.

    The whole task oriented workflow is a really great idea, and I will keep watching for the progress as this forms.

  20. As a user who has used Blender for 20 years, these are my priorities:

    – Wireframe colors (by group)
    – Filter by object type

    I sincerely don’t get the shader awesomeness for non-real-time things. If it’s for a game, I want it to look exactly as in the engine;
    – but for Cycles and Blender Internal, I don’t need any shader preview. I think it is a waste of developers’ time to try to mimic renderer features in OpenGL/GLSL.
    – Not sure if it is connected with the viewport, but a 3D view where you can paint color + bump + speculars (and thus update the view in real time with those), and also with several image layers: that’s some serious awesomeness that’s missing in Blender.

  21. I suggest that Blender Internal be externalized. Put it on GitHub as a single project that can run as a standalone app. Then build good (enough) external render engine support for it from Blender, and then let it go (there is a singalong for that moment).

    After that, anybody who thinks BI is important enough, will keep it alive. Release it, let it go. If it is as great as I have been told, it will thrive as separate program from Blender.

  22. Can’t wait to use this. Proudly a Blender fan, though I was born to Blender in the Cycles age.

  24. This sounds great, and I’m really looking to hearing more from bcon next month. I hope that we will be able to configure custom task presets from all the available plates and select them at any time from a menu in a similar way to how matcaps work currently (but available everywhere, rather than just untextured mode). If we could set custom defaults and also have the option to not change shader automatically when changing modes, that would be awesome. A good Python API is also important, so that users can integrate them into addons for tasks that maybe nobody thought of yet.

    I would also like to hear some clarification on what is meant by “wireframes”, since I’m not sure if it refers to the current wireframe only mode in Blender, or just wireframes in general – even when used as an overlay onto a mesh. Edit mode would be completely useless without wireframes, so a lot of users do get kind of worried each time talk of removing them is brought up, which is why clarification is necessary.

  25. I am excited about everything in the project outline, especially about the performance optimizations.
    The details get pretty hard to follow (imho) in this pure text document but we’ll learn more as the project develops. Looking forward to it!

  26. Awesome! Thanks for the comprehensive info.

    By the way, I just made a small donation to Blender Foundation to help the developers that make all these things possible. Please consider making a donation if you use and like Blender! 🙂 https://www.blender.org/foundation/donation-payment/

    Keep it up, Ton and team!

  27. It would be great to have access to the separate buffers of the graphics card’s G-buffer for compositing and post-production during OpenGL Render / OpenGL Render Animation.

  28. I have a question.

    How are you going to handle displacement map baking if you take away the internal engine? Are you going to migrate displacement baking to Cycles or something similar?

  29. It would be great to be able to rotate around in the viewport while a command like move, scale or extrude is active. Compared to other 3D software I regularly use, I believe this feature adds a significant amount of dynamic potential.

    Cheers 🙂
    Dimitar

  30. We are looking at integrating Hydra into the rest of our pipeline (or Fabric’s viewer, which will probably be based on Hydra anyway). Too bad that Blender will not get this; I was hoping that wouldn’t be something we needed to do 🙁

  31. I strongly believe Blender Internal shouldn’t go, and should remain part of Blender for those of us who are behind with technology. We can still enjoy its speed on our P4 systems, as with Cycles it’s really hard even to preview your materials. With the help of GLSL rendering it is a great feature, unless GLSL rendering can be implemented in Cycles. I speak as an aspiring artist with no technical background, so I stand to be corrected. Consideration could be given to us from developing countries, the “laggards” (well behind) of technology, who are still hearing of quad-core as a thing of the future.

  32. It’s written “A key point is that plates are drawn independent of each other” but yet it is also written “Each plate reads specific data from the scene graph and from other plates’ buffers”.

    It seems a bit like a contradiction? So what are the plates depending on?
    Is it a simple flat dependency system where they can all require the same buffers from the scene (depth map, etc.), or is it a complex dependency system where each plate can also read the buffer of another one?

    If it is the latter, plates are not really independent from each other, as they can depend on each other’s buffers.

  33. One thing that I would love to see is the ability to detach windows, so that you can place your menu bars on a secondary screen, allowing a full view of your modeling viewport.

  34. What did I read there? External engines for texture painting? :D
    Can someone describe this for me? Is this an attempt to import some brush engines from Krita, GIMP or MyPaint? Or did I misunderstand this statement, and it is something more low-level?
    Best wishes, monatsend

  35. Thanks for considering the various target markets and specialised needs out there, not only Cycles, although it is very capable. CPU rendering is great, but we sometimes have to deal with heavy scenes, and being able to approximate the look is an excellent feature.
