The Blender Institute (BI) crew has two main tasks: maintaining the blender.org facilities, and initiating high-priority strategic development projects – especially the ones that require long-term focus by experienced developers.
A strategic project usually involves complex architecture or design, for which working prototypes need to be developed. Once things have proven to work reliably, verified in public on blender.org, strategic BI projects are handed over to the modules and developer community.
Last year the emphasis was on projects that multiple developers could work together on. As a result, in 2021 the BI team delivered:
- Asset Browser
- Cycles overhaul (known as Cycles X)
- Geometry Nodes
- Library Overrides
In 2022 the BI team will further integrate this work into Blender and get the community involved through the modules.
Here are the strategic topics on the 2022 agenda:
- Application Templates
- Overrides
- Physics
- Texturing
Application Templates
Blender already supports Application Templates in an experimental state. However, users can’t easily and properly create their own stand-alone experiences powered by Blender. The authoring pipeline features, deployment, and future-proofing/compatibility still need to be figured out. A few example applications may be developed as use cases.
Overrides
The collection and object proxy system was fully replaced by Library Overrides in Blender 3.0. The new system exposed some long-standing complexity problems in Blender. This particularly affected resync performance. The goal of this project is to wrap up the overrides project, assess the future of the override pipeline, address related data-management issues, and hand it over to the module.
Physics
The next milestone of the Geometry Nodes project is to support physics dynamics. The first step is to handle interactive physics simulation and to find a solver that can work in real time. This can support both point-based particles and hair nodes, to completely replace the old simulation system.
Texturing
The combination of node-based textures and mask painting will help towards a better non-destructive painting pipeline within Blender. Brush management and performance are related topics that may be visited as well.
Next Steps
During January and February a series of workshops will take place at the Blender HQ to better define those projects, their expected outcomes, audience and possible solutions. With this in hand we can start to think in terms of teams, milestones and priorities.
As someone who does a lot of procedural texturing, what I would love to see is a dedicated curvature/edge mask node. Also, a better way to make good-looking scratches would be neat.
Combining all noise types into a single node would also simplify the system: it could just have a drop-down menu with the different types of noise, so when you choose one it changes the variables and inputs/outputs. More noise types would also give more freedom and variation.
I’m really excited to see what happens though, there’s so much potential with procedural materials.
1. We need GPU armature skinning if the armature modifier is last in the stack.
2. We need true GPU displacement in Eevee.
3. The Eevee rewrite looks ultra spicy / I can’t wait :P
Seeing more and more GNU/Linux distros moving towards making the Wayland protocol the default, I tried out Blender on both an Ubuntu Wayland session and Arch with a Wayland session (Sway).
Having issues with the NVIDIA proprietary drivers, I ran with the nouveau drivers (sub-optimal) and ran into many problems with Blender: a slow viewport and constant crashes during rendering.
There was some buffer crashing with nouveau. I know it’s probably not Blender’s fault at all. But I’m thinking ahead to when Blender moves towards a Vulkan back-end (great!), since many Wayland desktop environments such as Sway also favor Vulkan back-end development.
It would be great to have Blender developers address this: the future targets for the Vulkan back-end, and how to get it up and running under Wayland sessions. I think the work needs to be done sooner rather than later, given that many GNU/Linux distros are already moving to Wayland sessions by default.
Thank you for your hard work.
I just thought I would let you know that Wayland support has been worked on in Blender, and the changes necessary to get it working currently exist in the Blender code base. However, it is still in an experimental state and is disabled by default. If you build the Blender source code yourself you can enable Wayland support and experiment with it.
When will caustics get into master?
The MNEE patch is planned for Blender 3.2.
We know it has been a few rough years with COVID, but it’s sad that the animation project keeps getting pushed back every year. This is probably one of the most important projects Blender should put more focus on, with a serious overhaul especially in terms of tech & workflow.
I have faith that you guys can deliver a killer product that would put Blender on the big map. Hopefully there will be no more delays in the future as you figure out a way to do things around this pandemic, because COVID will be with us for a very long time, if not forever.
I think Blender should have a (decent and solid) particle system (like Houdini… or any other 3D software). I don’t think it’s an irrelevant missing piece. Geometry Nodes opened a lot of new and awesome possibilities, but there’s still the need for a decent particle system well integrated with all the rest (geometry nodes included… and supporting EEVEE too).
Anyway, I would have hoped to see, at least as a 2022 goal, making the “particle info” node (totally useless if you use EEVEE) work a bit with EEVEE too.
I can’t understand how a parameter like PARTICLE AGE could depend on the rendering engine used. Shouldn’t it be derived from the simulation itself? (#CURRENT_FRAME – #PARTICLE_STARTING_FRAME) / #PARTICLE_LIFE_DURATION, with valid values from 0 to 1.
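The engine-independent formula described above can be sketched in a few lines of plain Python (the function and parameter names are illustrative, not Blender API):

```python
def particle_age(current_frame, start_frame, lifetime):
    """Normalized particle age in [0, 1], independent of the render engine.

    Implements (#CURRENT_FRAME - #PARTICLE_STARTING_FRAME) / #PARTICLE_LIFE_DURATION,
    clamped to the valid range.
    """
    if lifetime <= 0:
        return 0.0
    age = (current_frame - start_frame) / lifetime
    # Clamp: 0 before birth, 1 after the particle's lifetime has elapsed.
    return max(0.0, min(1.0, age))

# A particle born on frame 10 with a 50-frame lifetime, sampled at frame 35:
print(particle_age(35, 10, 50))  # 0.5
```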
(I also think all other parameters of the node should rely on the simulation itself and not the rendering engine, so I can’t understand why after 3 years we are still at this point ;_; )
So… I don’t believe in miracles (like a new, modern, robust, decent, Houdini-like particle system well integrated with all the rest), but for 2022, Blender Foundation, make AT LEAST the AGE parameter of the particle info node accessible to the EEVEE engine, PLEASE…
Hi there!
What happened to the particle nodes?
Thank you for your hard work in sharing knowledge.
Hi, the particle nodes were put on hold and the team (me included) decided to focus on geometry nodes instead. The project hasn’t been abandoned though.
It would definitely be nice to be able to texture paint without having to first create blank textures, texture nodes, mix nodes for each channel, etc. You should just be able to start painting and all of that happens under the hood. Also, the baking workflow for UDIM textures needs a complete overhaul. It’s a complete nightmare the way it is now. Not even worth the extra time it takes.
In two sentences: texturing should be fast, and it should handle 8K painting on many layers simultaneously with one brush or clone tool.
So you could choose a brush or clone tool and paint on many layers in the same place at once (for example on albedo, roughness, normal, bump).
Ideally you would be able to choose a different brush for each of those layers and paint in the same place – or even in a predefined different place for each layer – all at the same time.
It would be super cool to have particle brushes similar to Substance Painter.
Geometry nodes currently feel like a programming language: process-oriented and relying on mathematical functions. Hopefully they can become more object-oriented and higher-level, more in line with a designer’s thinking.
Looking forward to the potential Override improvements.
Currently most of my rigs are absolutely useless, as I have to wait well over an hour to re-open scenes where I have them linked in.
Will there be a render queue function?
Keep up the good work!
By Physics do you mean particles as well?
This is yet to be seen. There is a more immediate need to modernize Blender’s hair tools and integrate them (and geometry nodes) with physics solvers. A new particle simulation system (akin to the experiments that preceded geometry nodes) may be a better target for the future.
Can geometric nodes be made more object-oriented so that they are more acceptable to the public?
There are no plans to change the geometry nodes design. But I don’t know what you mean by “making geometry nodes more object-oriented” (and how that is preventing you from using it).
Maybe he means object-oriented as in C++ programming: working with an object hierarchy in nodes, so it is easier to build complex setups with families of nodes. For example, a group type for stones, a group type for sands, and a group type for trees, which could all share a « forest » base by « inheritance », and then collide between objects and be placed freely. I think what he calls object-oriented would be a good thing for managing complexity (for beginners as well as advanced users), because we could « see » the hierarchical family as a direction towards the final result, without getting lost in a lot of nodes. Hope that helps, if that is what he meant.
Perhaps one way to approach this would be leveraging library overrides *within* the Blender file in an intuitive way. Just thinking “aloud” here, I could envision something like this:
* Defining a node group implicitly makes that group a library, available within the .blend file and by linking from another file.
* Every node group has an optional attribute (default “None”) that links to its library parent (again, local or linked). If this attribute is empty, the default behavior is identical to today.
* If the parent library link is present, the Group Input and Group Output pins of the parent library are automatically inserted into the new group and have their parent’s functionality by inheritance. The new node group can make only limited changes to the definitions of these pins, as noted below.
** If the child connects to an inherited input, this is simply using the same data with new node logic and does not alter the input’s metadata.
** If the child connects to an inherited output, this is an implicit override (handled internally as a Library Override) of that output pin.
** Define a new Field node input called “Inherited” or “Super” or “Base” with exactly one output. This output, semantically (but perhaps not literally) a callback like other Fields, returns the output from the parent library node group and determines which output to return by context (again, like other Fields such as Position).
** If the “by context” part of the preceding is too complex to implement (or counter-intuitive for users), the “Base” node could have all the output pins of the parent library node group, and it would simply be defined as “evaluate the parent node group with exactly the input data from this instance of the child.”
** Input and output metadata can be altered only in specific ways, to wit:
*** The description (tooltip) can be overridden freely.
*** The default, min, and max values can be overridden (my initial thought is that input limits could be overridden freely, allowing child groups to handle wider data variations if their logic supports it, but output limits could be overridden only to make the ranges narrower, so full polymorphism is guaranteed).
*** The data type of inputs can be promoted (e.g., an integer input from the parent could accept a float in a child, but not vice-versa, again to ensure polymorphism).
*** The data type of outputs can be demoted (e.g., a float output from the parent could be restricted in the child to emit only integers, which would not violate polymorphism).
** New inputs and outputs can be freely added in the child node group.
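To make the promotion/demotion rule above concrete, here is a toy Python sketch of the proposed type-compatibility check (purely illustrative of my proposal, not existing Blender code; the names are hypothetical):

```python
# "Promotion" order for socket data types: every int value is a valid
# float, so a child may widen an int socket to float, but not the reverse.
PROMOTION = {"int": {"int", "float"}, "float": {"float"}}

def input_override_ok(parent_type, child_type):
    """A child group may *promote* an inherited input (accept more types)."""
    return child_type in PROMOTION[parent_type]

def output_override_ok(parent_type, child_type):
    """A child group may only *demote* an inherited output (emit less),
    so every child output remains a valid parent output (polymorphism)."""
    return parent_type in PROMOTION[child_type]

print(input_override_ok("int", "float"))   # True: int input widened to float
print(output_override_ok("float", "int"))  # True: float output narrowed to int
print(output_override_ok("int", "float"))  # False: would break polymorphism
```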
One complex issue that I am not sure how to handle would be ensuring the child behaves gracefully if the parent’s inputs or outputs are altered. I haven’t looked at the code for Library Overrides, but my supposition is that the Blender team has already contended with this issue and hopefully could apply the same pattern.
It’s entirely possible this pattern has already been considered and rejected, but if the team thinks this is worth exploring I can write it up more formally and post on the Right Click Select forum.
On a tangential note, would there be any possibility of adding an Add Menu metadata item to node groups so they can be organized into submenus under Add, without the need to create a plugin with Python logic? For example, if I create a node group called “Curve to Vine” for procedural foliage, and populate this new value with “Syscrusher/Foliage”, then the node group would be added with “Add…Syscrusher…Foliage…Curve to Vine”. This would allow user-defined node groups to behave more like native nodes and would facilitate organizing large reusable node libraries.
Thanks for listening, and thanks for the *spectacular* work that has already been done on Geometry Nodes (my particular area of passion) as well as the many other improvements. Blender 3.0 is a tremendous accomplishment.
Texturing huh? Good time to bring up this proposal again. https://blender.community/c/rightclickselect/hncbbc/?sorting=hot
What is the status of these projects:
1. https://code.blender.org/2021/10/gsoc-roundup-episode-four-lets-get-physical/
2. https://code.blender.org/2021/09/gsoc-roundup-episode-three-ahead-of-the-curve-on-the-cutting-edge/
Here: The Curve Pen Tool
3. https://code.blender.org/2021/09/gsoc-roundup-episode-two-editorial-endeavours/
Here: Pack Islands to Box Area, Snapping Improvements.
The “New grid types for the UV Editor” seems to be in 3.0 but there are no Release Notes for it.
The Release Notes need a coordinator who checks that all of them are moved over from the Weekly Updates. Otherwise the changes are not communicated properly, and at release time they disappear or are nonexistent for most users.
There are some developers who don’t seem to be active anymore. It would be nice if this could be somehow communicated. So maybe someone could pick up their work and finish it: Manuel Castilla (manzanilla), Pablo Dobarro (pablodp606) and @Sebastián Barschkis (sebbas).
Not every developer posts the link to the work they have done in the Weekly Reports. Especially Brecht Van Lommel (brecht).
For the Curve Pen, see: https://developer.blender.org/D12155, it is still under development, though the last change was a month ago.
I am also missing the UV Offset operator mentioned here: https://wiki.blender.org/wiki/User:Sidd017/FinalReport#UV_offset_operator
New physics solvers – have you planned on using any open-source libraries/projects, or done any research into the different options, like: SPlisHSPlasH – the most ready solution (by Interactive Computer Graphics, which has many other open-source simulation engine/solver projects), libWetCloth, Fluid Engine Dev – Jet, PROJECTCHRONO, DualSPHysics, PhysBAM, the ZENO node system, Taichi Elements, Crowd Dynamics (by Aalto University), Vadere (crowd simulation).
Or are you going to develop solvers from scratch, using open technical papers, to make solvers work interactively with each other – liquids with density and viscosity; rigid bodies with density and two-way coupling / floating on liquid; softbody/cloth floating in air/gas/fire; traffic/crowds with agent-based “AI”/rules? How about gas and water interaction (“bubbles” – gas that goes through multiple liquid layers of varying density and viscosity, with bubbles exploding at the surface and releasing gas, maybe even catching fire because of heat, density and pressure changes)? What level of interaction has been designed for? Any need for ideas/test scenarios that would be helpful in designing a full-featured simulation framework?
And how about layered simulations?
Also, maybe the Blender Foundation should consider contacting people/developers like this:
https://devtalk.blender.org/t/realtime-gpu-smoke-simulation/13142 looks like a very interesting project.
I have the impression they don’t have the right people in their development team to constantly improve their simulations. This part of Blender mostly gets minor tweaks.
Would be cool to find a way to improve this in the future.
As for the Physics please consider https://mujoco.org/ It’s open source now under Apache License 2.0.
Killers! You guys are VERY GOOD!
Thank you for the hard work!
Another thing I would love: the ability to have Mantaflow inflow objects target an object.
FLIP Fluids has this feature, and it makes things like hoses and water guns easier, because the inflow object always points the inflow toward the target even when parented.
It would remove the need for the XYZ velocity controls, and for creating complex drivers for something as basic as having a character shoot a water gun, fire, or gas.
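For reference, the vector math such a targeting feature would recompute each frame is simple (plain Python for illustration, not the actual Mantaflow or FLIP Fluids API):

```python
import math

def aim_velocity(inflow_pos, target_pos, speed):
    """Velocity vector pointing from the inflow toward the target.

    Recomputing this each frame as either object moves keeps the jet on
    target, replacing hand-animated XYZ velocity components or drivers.
    """
    # Direction from inflow to target, component-wise.
    d = [t - p for p, t in zip(inflow_pos, target_pos)]
    length = math.sqrt(sum(c * c for c in d))
    if length == 0.0:
        return (0.0, 0.0, 0.0)  # inflow sits on the target; no direction
    # Normalize and scale to the desired inflow speed.
    return tuple(speed * c / length for c in d)

# Inflow at the origin, target 4 units along X and 3 along Z, speed 10:
print(aim_velocity((0, 0, 0), (4, 0, 3), 10.0))  # (8.0, 0.0, 6.0)
```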
Is the geometry nodes physics plan to integrate cloth, softbody, and rigid body?
The reason I ask is that back in the 2020 Google Summer of Code, Blender had a project to overhaul the softbody physics system to be fully volumetric, and it appears that that overhaul is dead.
I would assume it would make sense to integrate a volumetric softbody engine, as it would integrate well with mesh deformations, particularly with character rigging. This would make a lot of character animations easier, especially jiggle and squish, and cases where the cloth sim would be inappropriate (cloth is meant for open, flat meshes – like cloth). I have had loads of cases where I wanted a character to have fat, muscle, or loose skin shift and squish, or have a character sit or press against something and deform appropriately (like when someone takes a punch, sits in a chair, or is pushed against glass).
I also understand this would apparently require some overhaul of the remesher, to allow for TetGen to dice the mesh appropriately.
Sorry for the long post, It just gets annoying when I have to use/develop so many workarounds, or use Houdini just to do such a simple task.
Very exciting! Hair and groom improvements are at the top of my wish list. Has there been any discussion about incorporating a new groom system and tools with geo nodes particles?
Yes. You can read the latest hair design document here:
https://wiki.blender.org/wiki/Modules/Physics_Nodes/Projects/EverythingNodes/HairNodes
It doesn’t cover grooming much. But it lays down the foundation for the hair tools and nodes.
Is Eevee SSGI still planned and just not big enough to make the list, or is it getting pushed back? I remember someone mentioning something about Nishita skies for Eevee a while back, although that one probably IS too little to make the strategic targets list.
EEVEE’s rewrite is still ongoing. You can read more about it here: https://code.blender.org/2021/06/eevees-future/
Any targets for updating compositing? It lacks quite a few basic items that would make it be a considerable alternative to Nuke/Natron.
Proper mask painting for one but obviously proper colour spaces are just non existing (e.g. ACES).
The most important is speed though, even with the new full frame option it just takes ages to manipulate a 4-6K image.
Thanks
Hi Sam. The compositor development is a bit halted at the moment. Community involvement would be welcome in this area. The fullframe project can still go a long way.
On a side note, if you are interested on real-time viewport compositor you can subscribe to this task to follow its development: https://developer.blender.org/T91293
What are the plans for sculpt mode? Is it abandoned again because Pablo left?
I don’t think it is fair for a module to rely on one person.
I second that question – sculpting/painting?
Yeah? Why did he leave? Did something happen between him and the Blender devs? He’s now developing an awesome retopo app for iPad, which should have been for Blender. What a shame.
Your ‘Strategic Targets’ are a very good choice.
I wish the developers good luck in this important project for a good result.
Thank you
C++ support for add-ons, please.
Are there any plans to update input device support?
I’d really like to see a hybrid input workflow implemented using various devices for access to different areas, for example:
NDOF Devices or multitouch gestures used for pan and zoom, with the rotation origin defaulting to the view origin, and automatically switching to selection centroid.
Stylus for grease pencil, selection mode, and brushes for sculpting or texture painting
Touchscreen for manipulation of 1d and 2d windows.
I think this would significantly improve some workflows in speed, efficiency and productivity.
There are developers from the community working to improve touch gesture support (for devices like Cintiq tablets). It is still too early to tell how this will help workflows like the one you suggest, but it is the first step in this direction anyway.
This update sounds great. Does this mean that the hair system will be integrated with geometry nodes in 3.1? And will there be any improvements to hair systems?
Definitely not in 3.1. But the plan is indeed to integrate the new hair object type (which is experimental and hidden from final users) with geometry nodes, and to support node-based hair braiding and creation. There is not even a combing mode for the new hair object type yet, so plenty of work ahead.
Does this mean the animation project is being pushed back again?
Yes. See my reply to Zadek a few days ago.
Texture painting is underwhelming. With a few improvements it would be vastly better: some painting layers, better brushing and control – a bit more like a painting program.
Amazing plan. I’m especially excited about the evolution of the texturing system as its core to everything I do in Blender. Would definitely love to see some new capabilities and workflow improvements realized on that front and excited to see and try out what the amazing blender devs can come up with.
Still no C++ API. Sorry to say but you are missing a trick here.
Blender will explode with 3rd party plugins when it finally gets a C++ API.
I don’t think that’s going to happen but I think it should!
“The first step is to handle interactive physics simulation, and to find a solver that can work in real-time.”
Does this mean that this is for smaller-scale stuff and later down the line it would get scaled up for more complex stuff which can’t be done in real-time? First, there would be a “game engine” level complexity in simulations, and then in a few months/years, there would come support for the “VFX” level of complexity?
Either way, it all sounds exciting :)
The main difficulty of working with materials is masks for textures. Creating three, four or more masks greatly piles up the screen with nodes and makes it difficult to work. I like nodes for complex projects, but I would like to see an alternative with layers for standard projects with masks and textures.
There could be a new node that supports layers; with the new way attributes were designed, this could be easy.
“There could be a new node that supports layers; with the new way attributes were designed, this could be easy.”
That is a nice idea: nodes that support layers. Sort of an app within a node system.
I would also like to see addons/plugins based on nodes, like I saw in Softimage XSI ICE. I think nodes today are too low-level for most artists. The ideal level is 3ds Max’s Particle Flow; many people criticize 3ds Max, myself included, and Particle Flow is legacy (not even multicore), but its nodal language level is, I think, the best compromise between artists and more technical artists. Look at the newer tyFlow for a more modern approach.
Thank you for not forgetting about the texturing mode. I often see on YouTube that a huge number of people, after modeling, do their texturing in another, most famous program. And I can see for myself how limited this mode is compared to that most famous program.
Excited about the texturing project. I really hope the team will aim for a more Substance-like workflow. Currently, working with PBR textures in Blender is quite time-consuming and tedious.
Would be great if the baking process could see some love as well. I regularly need to export assets out of Blender to game engines, and have to use two different addons to make that work. It would also be great if shaders could be baked automatically when exporting the model in a format that supports embedded textures (e.g. glTF).
Excited to hear texturing is going to get some love and attention. It’s desperately needed: the texturing workflow in Blender has barely changed since the pre-2.70 era. Hopefully we can gain the ability to perform Substance-like painting of multiple PBR texture maps into multiple texture slots in single brush strokes. Painting decals of details onto surfaces quickly would be a dream come true.
As for application templates, I couldn’t wait, so I’m already using the system, I can tell it’s not meant to be used yet but I had no issue squeezing my own templates into Blender. And it’s been very useful. I have templates for common types of renders I need to perform for product visualisation, and it definitely speeds up my workflow being able to open Blender, pick a template from the splash screen, and have all the lighting and compositing already setup to my usual methods.
A new physics simulation system does sound very fun, although not essential for me, however I don’t do much animation so that’d be why.
I only hope that there may be some time this year to squeeze in some upgrades for baking in Blender, especially since texturing is already being looked at. At least some workflow improvements, any would do, no matter how small. I’m sorry but there’s no kind way to describe it, right now baking in Blender is a horrible mess and basically not worth even using unless you have a decent addon to improve the workflow. Improvements for baking have been on the drawing board for many years, hopefully some small improvements can be made this year.
I’m looking forward to the physical engine update.
does this mean things like particle nodes and things like it will be merged/developed on top of geometry nodes?
Yes, the idea is to integrate them with the geometry nodes system.
Is this everything that’s planned for 2022? I thought the 3.x roadmap had a lot more planned.
Hi Timothy, the 3.x roadmap also includes the work planned for the modules and online community of developers.
I really hope that sculpture work will be added to the list. Especially against the background of the latest news. People will try Blender as an alternative. This is a great opportunity to introduce not only new users, but also titans of the industry to a possible alternative.
There is a lot of development planned for sculpting in 2022. Joseph Eagar will share more details on this soon in a blog post.
Any links to where you got this info from?
Dalai is part of the dev team lol
So is the Animation 2022 not happening?
From the 3.x roadmap article:
“Blender Institute will lead this project by involving industry veterans for design and implementation.”
It is still not possible to fly in external contributors for long workshop and development sessions here due to travel restrictions. Until things normalize again, the focus is on the projects the existing development team can work on with the Blender Studio and the online community.
Thank you for the answer and explanation.
How about all the things in the box from years ago that still need and await polish:
Keymaps, theme color management, Apple Magic Mouse support (if drawing tablets work in Blender on Linux, why not the Apple Magic Mouse? Apple has entered the Blender world!!), etc. The old bug list becomes longer and longer… keep this in mind; it won’t get better without doing something…
Maybe a task force to fix all the little basics. You know, the basics you start from.
Thanks in advance
Hi,
A lot of what you mentioned are welcome module development targets. Bugs and polishing are indeed an integral part of the modules’ ongoing agenda. Some topics, like the keymap manager, are a bit more complicated: the required amount of design makes them less likely to be handled by a module (user interface). They can eventually be prioritized as strategic, but for now the focus for the Institute is on projects with more unknowns and bigger impact.
It is important to always reiterate that the more contributors join the project, the more activity can happen in the modules. To help with this, one of the targets this year for the development team as a whole is onboarding, documentation and better tooling.
Sounds very exciting – I’m especially hyped for the physics, which might make interaction mode a reality someday.
I hope we also get the Animation 2022 overhaul started at some point in the year.
That would be icing on the cake.
Could we see some path-based procedural shader assignment at some point?
And finally a nodal render management context?
Let’s modernize the obsolete workflows before anything else is added.
All modern productions have worked with nodal procedural render management for the past 5 years at least…
Would love to see that shapekeys could also change the bones of the armature.
Meaning that you could bake/freeze the bone position and length into the shapekey.
Hi, I suggest attending one of the Animation & Rigging meetings. There you can make your case, and hopefully help with the design, testing and development.
Check the module page for more information: https://developer.blender.org/tag/animation_rigging/
Water nodes please! It would be lovely to be able to choreograph water feature jets with LED lights.
I’m hoping the particles and physics will include plenty of options for fluids (especially water)
But one day we will have real-time physics, and Blender will be usable in its interactive mode in virtual reality, with an Eevee that can use more samples while you move things around.
I hope texturing finally gets the pointer oriented to the normals, like in Sculpt Mode.
As always, thank you for your efforts! I’m looking forward to seeing more advancements in Blender’s sculpting feature this year as well.
Fantastic news.
In the beginning the focus is on a generic framework, particles, and hair. Are the other physics systems also being moved to nodes?
Hope you can do the workshops in feb without many restrictions.
Hi Silas, it has always been the idea to integrate the different physics systems – at least to have the output of one simulation be the input of another simulation system.
It may not be possible to have all the simulation systems created with nodes though, at least not initially.
What about the sculpting mode? (((((
Hi, the development of Blender is not restricted to the strategic targets. If you take a look at the work done in 2021 and the main targets at the Blender Institute, that should be clear.
In fact, there is a lot of development planned for Blender’s sculpting features in 2022. At the moment a developer is receiving a development grant. He is focusing first on merging sculpt vertex colors, and will soon write a blog post about his development targets, design and planning.
That’s awesome news. Besides the holiday depression, I was dealing with Pablo Dobarro taking a break, since “when will sculpt + color be implemented” has been my favourite question every year since I switched to Blender. I’m so excited for this, I’m not even depressed anymore.