Blender Institute had 3 days of coder/artist workshops on 2.8 designs and targets. Having 12 brains going over all aspects of usability and technical concepts has been extremely useful. The team is currently gathering all notes and designs and will make extensive reports for everyone here soon. Some docs will appear on the code.blender.org site; most of them will go to wiki.blender.org.
I can happily confirm that we agreed on everything we’ll present. It’s well aligned with what we started for 2.5, very good to have clear focus and direction! Of course we look forward to a public review and high level of support by everyone who’s involved with Blender.
While the real detailed docs are getting reviewed and finalized, I’d like to share the highlights with you in this short summary:
Blender 2.5 concepts
- The design principles of the original 2.5 doc were verified and confirmed to still be valid.
- We suggest making more use of temporary popups for settings (viewport settings, etc.), which clears up the overly long property panel lists.
- The select-tool order can be kept working via a new toolbar that will present tools using the new widgets/manipulators. The goal is that most operators will get their ‘redo’ settings as widgets in an editor, allowing users to tweak operations more efficiently, but also to repeat them on a different selection. For example translate and rotate, but also spin, duplicate, extrude, knife, etc. can work this way.
Layers
- The Scene Object list will become a hierarchical list of Object Collections. Each Collection can hold other Collections and hold all the Objects you want. Objects can be in multiple Collections.
- The new Layers then use (= point to) the Scene Collections to define what’s visible or editable, or to set overrides (draw types, materials, etc). This way you can set up Layers that use the same Collections, but with different visibility and render settings.
- Those Layers will be used by all editors (including viewports) and render engines and the compositor.
- Visualization of Object-mode and Edit-modes in Blender will be coded as a special drawing engine, composited on top of viewport drawing/rendering itself.
- New configuration files will make it possible to define “Workspaces”, which will be available as tabs in the top header of a window. Each Workspace can store multiple layouts (previously called ‘Screens’) and will allow filtering or changing visible panels, tools, menus, active add-ons and shortcuts.
- Blender’s screens (layouts) will get a default top bar for menus, for active scene, for active mode and for active layer. Scene, Mode and active Layer will be ‘global’ for all editors in that workspace.
- Further workflow configuration will be enabled by efficient use of template files (sets of default assets or primitives) and a project file (settings that need to be set per project, such as file paths or web settings).
- Special attention will go to delivering a couple of low-level input “keymaps” for devices like mice, pens, tablets and touchpads. These should be selected based on your hardware setup and preferences (left select, right select). On top of that we will make one strong (more minimal) default shortcut keymap for all basic operations.
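To make the Collections/Layers relationship above more concrete, here is a purely hypothetical Python sketch. None of these class or attribute names are real Blender API; they only illustrate the described relationships: Collections nest and share Objects, and Layers point at Collections with per-layer visibility and overrides.

```python
# Hypothetical data-model sketch; not actual Blender 2.8 API.

class Collection:
    def __init__(self, name):
        self.name = name
        self.children = []   # nested Collections
        self.objects = []    # an Object may appear in several Collections

class Layer:
    def __init__(self, name):
        self.name = name
        # (collection, settings) pairs: the same Collection can be linked
        # into several Layers with different visibility/render settings
        self.links = []

    def link(self, collection, visible=True, selectable=True, overrides=None):
        self.links.append((collection, {
            "visible": visible,
            "selectable": selectable,
            "overrides": overrides or {},   # e.g. draw type, material
        }))

# Two layers sharing the same collection with different settings:
chars = Collection("Characters")
chars.objects += ["Suzanne", "Cube"]

viewport_layer = Layer("Viewport")
viewport_layer.link(chars, overrides={"draw_type": "wire"})

render_layer = Layer("Render")
render_layer.link(chars, overrides={"material": "clay"})
```

The key point is that the Collections themselves are shared; only the per-layer link settings differ.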
Technical design
- Each Layer gets its own dependency graph.
- Overrides can be static (on start/load) or dynamic (animated).
- Overrides come in two types: replacing values (color, etc.), or adding/removing items from a list (modifiers, constraints, etc.).
- Particle and hair systems will be put back, but as their own primitive type. A full proposal is coming soon.
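The two override types could be sketched like this. This is an illustration only, with invented names; the actual override design was still being finalized at the time.

```python
# Illustrative sketch of "replace" vs. "add/remove" overrides; invented API.

def apply_overrides(data, overrides):
    """Return a copy of `data` with the layer's overrides applied."""
    result = dict(data)
    for ov in overrides:
        if ov["type"] == "replace":
            # replace a single value, e.g. a color
            result[ov["key"]] = ov["value"]
        elif ov["type"] == "add":
            # add an item to a list, e.g. an extra modifier
            result[ov["key"]] = result[ov["key"]] + [ov["value"]]
        elif ov["type"] == "remove":
            # remove an item from a list, e.g. drop a constraint
            result[ov["key"]] = [x for x in result[ov["key"]] if x != ov["value"]]
    return result

base = {"color": (0.8, 0.8, 0.8), "modifiers": ["Subsurf"]}
layer_overrides = [
    {"type": "replace", "key": "color", "value": (1.0, 0.0, 0.0)},
    {"type": "add", "key": "modifiers", "value": "Wireframe"},
]
overridden = apply_overrides(base, layer_overrides)
# `base` itself stays untouched, so the override is non-destructive
```

The non-destructive aspect is what matters: the original data is never modified, each Layer just sees its own overridden view of it.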
Participants were a very diverse group of coders and artists/contributors:
Pablo Vazquez, Julian Eisel, Paweł Łyczkowski, Daniel Lara, Sebastian Koenig, Bastien Montagne, Brecht van Lommel, Mike Pan, Sergey Sharybin, Dalai Felinto, Ton Roosendaal and Jonathan Williamson remotely.
Ton Roosendaal
1 December 2016
How do I become a developer????
Some of the new concepts seem to be really great.
The only thing I can think about is how often you will typically click on a workflow tab. Once per… week? Twice per day? Why have tabs for such a feature, and not a dropdown menu?
This layering solution sounds very limiting when you start to think about render layers. It’s very common for a shot breakdown to have one object appearing in multiple layers. This system doesn’t seem to facilitate that.
One thing that makes SketchUp very fast to model in is their Inference Engine. Any chance of doing something like it in the Blender 2.8 workflow?
Maybe have the GUI organized in some XML (or XUL), so that it can be extended by add-ons. (Thinking of the Properties window with its fixed buttons and panels; it would be nice if such things could be easily re-organized, or buttons added, also via Python.)
SOLID GROUPS
It would be nice to have groups that act like groups in other software, so that grouped objects act as one object:
first TAB press: open the group for editing objects inside
second TAB press: open Edit Mesh for the selected object in the group
exit Edit Group by deselecting everything in the group and pressing TAB (or selecting an option from a menu)
I think it might be useful to look at the beta layout in MODO 10.2; there are some interesting ideas there, like the ability to quickly turn panels on and off, and to have tabs easily arranged within a single panel. MODO has its flaws, but there are similarities in the interface customization, so I definitely think it’s worth looking into its strengths. Looking forward to seeing what comes out of Blender 2.8; I’m personally considering making the switch :)
The new layer system and the hierarchical collection system behind it seem rather powerful.
Separate depsgraphs could maybe solve the performance bottleneck Blender currently has when dealing with many objects.
If the new design also contained something like “collection modifiers”, this would be absolutely great for use cases where you deal with many objects and want them to interact in certain ways. Handling a collection of objects under a common modifier would allow inter-object operations on the objects of that collection.
Some thoughts:
- A collection modifier could apply to a collection and all its subcollections by default.
- Child collections, or even a set of siblings inside one collection, could override the modifier settings with their own settings.
- Individual objects could be addressed with a two-component address: hierarchy depth and sibling number.
- You could put a modifier on a certain hierarchy level and move it up and down the hierarchy, similar to the stack-like construct we have now.
- Additionally, you could select some siblings (and even make a new collection out of them, though then you might need to address the objects in the subcollection) to specify the “influence” of the modifier, i.e. which objects it should apply to.
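A very rough sketch of how such inherited-with-overrides modifier settings could behave (nothing here is real Blender API, and the whole idea is speculative, it is just illustration):

```python
# Speculative sketch of "collection modifiers" with inherited settings;
# invented classes and names, not Blender API.

class Collection:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        self.modifier_settings = {}   # local settings/overrides
        if parent:
            parent.children.append(self)

    def effective_settings(self):
        # walk up the hierarchy: child settings override parent settings
        inherited = self.parent.effective_settings() if self.parent else {}
        return {**inherited, **self.modifier_settings}

root = Collection("Buildings")
root.modifier_settings = {"type": "Array", "count": 10}

child = Collection("Windows", parent=root)
child.modifier_settings = {"count": 4}   # overrides only the count

# child inherits "type" from root but uses its own "count":
print(child.effective_settings())   # {'type': 'Array', 'count': 4}
```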
Please make the layout like Softimage XSI: well organised, intuitive and beautiful.
Please be inspired by Softimage XSI. :)
My two points:
1.
I like having add-ons linked to a workspace/layout, but it could lead to annoying lags if one sets a custom scripts folder on a network.
2.
I like the collections paradigm (though the ‘group’ name could have been kept), and I would like it even more if collections could be “locked”, so that selecting an object of a locked collection selects the whole collection, and editing and applying modifiers/constraints works on the whole collection too. Proper icons in the outliner should then expose this, somewhat like in Cinema 4D. The problem is how to manage objects present in more than one collection.
Hi !
Great to hear news about all this !
New layer system (with overrides) is good news.
But what about the dupli_group system ?
Currently, to link an asset we use a linked dupli_group and proxies.
It works pretty well, but we cannot override anything (shaders, modifiers, etc.) except through Python handlers, or by copying/linking dupli_group objects into the scene…
Is there a plan to change/modify this ?
Oh! And please make it so I don’t have to recompile the software just to customize/replace the icons :) The simpler you make it for users to “make Blender their own”, I think, the faster they will become disciples of the program. I can’t even use Blender to make art yet, but I have become extremely attached to the program since I started customizing the theme and key mappings. I don’t know the psychology of it, but I’m hooked!
The workshop was about workflow; that means Blender should help with the job people intend to do with it. And if it doesn’t, we should allow better configuration. We focused first on features that help get more productivity. Theme colors or different icon sets were not really discussed (aside from it being a lot of work to handle them).
If you have icons that suit your work (or other people’s work) better than the current ones, please share them, or explain the exact use case for swapping them.
Hi Ton!
It’s a very low priority, I understand. But for me personally icons are a big part of my workflow because I’m a visual worker/learner and icons help me identify things at a quick glance and kind of keep me grounded in the software’s environment.
For example, 3ds Max recently removed all the icons from the animation editor’s hierarchy view because they thought it gave it a “cleaner” look. Where I used to be able to quickly glance over and identify what kind of value was being animated (float, XYZ, constraint, script, etc.), now all I see is a jumbled wall of text.
As a result I feel less fluent in the editor which has definitely made working with animation controllers and rigging more difficult for me.
So back to Blender, for someone like me, swapping in an icon set that was more bold and flat (less borders, shading, etc.) would make quick visual searching faster and keep my eyes from getting distracted. As a result I can navigate faster and just generally feel more comfortable in the workspace. Additionally in my ideal workspace I would accompany a lot of the text-heavy areas in Blender with icons because those visual markers help me remember “where I am”. If the icons were swappable I could tailor them to my preference and share them with others who liked the way they looked without bothering users who like the icons the way they are.
I realize this all might sound kind of dumb to a lot of people and the work might not be worth the reward, but thanks for taking the time to read through my thoughts about it anyway. :)
Anyhow, great talking to you. I have a ton of respect for what you guys are doing and am very, very excited about the workflow project!
This all sounds pretty encouraging! I’m not much of a Blender user currently, but I absolutely love the idea of Blender and love following all its developments. I’ve tried off and on for about two years now to get into it, but the UI/workflow concepts were too frustrating, and eventually I’d just give up again and go back to software I’m already productive in.
I hope customization and flexibility will be the driving force behind the final decisions, because I love how customizable Blender is. Also, really glad to see Daniel Lara involved!
Did the 3d cursor come up at all in the workflow meeting? Is it likely that it will be improved with all the other widget/manipulator changes that have been mentioned?
Whilst the 3d cursor is useful, I do think that it isn’t currently powerful enough to warrant an entire mouse button being devoted to moving it. Maybe I’m missing something, but the process for moving it precisely is somewhat cumbersome (select target vertex or edge etc. then select move cursor from menu), when this could be improved with the ability to snap or move it around like any other object, and I’m sure there are tons more ways it could be improved to make it a more valuable tool. Don’t get me wrong, I think it’s a good idea, but it’s just very underdeveloped considering its level of importance.
It didn’t come up; we know it’s a feature with limitations. The current list of todos is already extremely ambitious. A well-designed input keymap can also help solve things.
Thank you for the default bar. PLEASE MAKE IT COLOR-RECOGNIZABLE BY LAYOUT. That way, when the user switches to Weight Paint it could be some color/shade, when switching to compositing it could be another color, and on UV editing yet another color. Differentiating modules (layouts) is key for a new user to understand the many trades Blender is capable of.
Thank you for separating particles and hair. Thank you for all the global toggle keys. Thank you for thinking more about UI. Thank you for considering more hardware interaction (a big plus!). Popups to reduce Property bar space: THANK YOU SO MUCH! And since all of that is going into the API, we can tap into it for other things. Floating windows (popups) for holding properties are necessary for arranging direct access to settings and tools. Coding that into bpy is going to be a blast to use! :D
We can make sure that workspaces have a distinct color theme you can configure, but we should be very reluctant to use colors with fixed meanings in UIs in general. I think UI designers nowadays agree that a useful/meaningful use of colors in UIs can only be done by using as little of them as possible. Humans can distinguish and memorize the meaning of only very few colors (4 or 5); above that, things get messy quickly, especially because every UI concept would want to claim the same right to use colors.
This is one reason why so many UIs are designed neutral (grey/white) with minor color accents.
Another reason for being careful with colors is the fact that Blender is a creative tool. Colors are made by the artists; a UI shouldn’t fight with them! For example, choosing a dark or a light theme already influences the art you make.
Love it all! It seems to be going exactly in the right direction; many points I had already thought about myself.
The only weak point, from where I sit now, is that having both “Workspaces” and “Layouts” seems overkill and redundant, from what I could grasp here.
Having just one single, well-implemented system for defining tools and layouts would be simpler and quite enough, and possibly easier to implement/maintain.
They may look redundant now, but they’re really not.
A workspace is dedicated to a certain workflow, with a certain screenlayout, but more importantly with your own configuration of panels and tools. Even hotkeys or entire keymaps can be assigned per workspace. You can assign different layers with different render engines per workspace. That way you can quickly change between different workflows that are perfectly customized to your needs.
The screen layout will just be a quick way to arrange your editors per workspace. So you can easily switch from one big viewport, to viewport plus node editor, to viewport with node editor and UV editor and so on, without losing your current workspace configuration of tools and hotkeys.
Furthermore, the screen layouts will no longer be named; you can choose them via automatically generated thumbnail images. :)
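The split could be sketched roughly like this, with invented names (none of this is real Blender API): a Workspace owns the tool configuration, keymap and add-ons, and holds several screen layouts you can switch between without losing that configuration.

```python
# Rough sketch of the workspace/layout split; invented classes, not bpy.

class Workspace:
    def __init__(self, name, keymap, addons):
        self.name = name
        self.keymap = keymap        # e.g. a keymap assigned per workspace
        self.addons = addons        # add-ons active in this workspace
        self.layouts = []           # editor arrangements (screen layouts)
        self.active_layout = None

    def switch_layout(self, layout):
        # only the editor arrangement changes; keymap/addons stay put
        assert layout in self.layouts
        self.active_layout = layout

shading = Workspace("Shading", keymap="default", addons=["node_wrangler"])
shading.layouts = [["3D View"], ["3D View", "Node Editor"]]
shading.switch_layout(shading.layouts[1])
# shading.keymap and shading.addons are unchanged after the switch
```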
Thanks for the clarification, that does indeed sound a lot more useful and powerful too, and feels less redundant.
In that case the only remark left is that (my vision may be clouded by my personal workflow, but) I don’t see one switching between workspaces often enough to warrant permanently visible tabs on top. The current content/UI screen-space ratio is already crowded, so I hope they can be hidden.
Anyway kudos for all the effort, really looking forward to see how this goes :)
About time on the default top bar. One of the most annoying parts of Blender is the overly modularized UI/UX. ‘Global’: what a concept.
Looks delicious !
1/ If I understand correctly, widget buttons toggle the widget display, which is then used to perform the action and tweak it, replacing/complementing the “last operator” panel? That sounds really good, and it also seems to solve the longstanding problem of having this panel only in the 3D View (provided widgets are available in all editors).
2/ Collections effectively replace groups as a sort of more flexible tagging system? (The much-requested nested groups.)
3/ concerning particles, I think it’s a good move that they get their own object type. Many systems are tied to objects right now and it seems to make things complicated to manage technically, not to mention ambiguous from a user pov…
However, looking to the future and especially towards the infamous object nodes, I wonder how it is all going to work together. When picturing a future ideal workflow where particle sims could be meshed (an absolutely random example), which object type would hold the meshed output? Would the particle object feed its meshing result into a classic mesh object, overriding the original mesh data? Just trying to think ahead.
Looking forward to seeing that stuff unfold… :)
Hadrien