Layered Textures Design

Texturing is one of the strategic targets for 2022. The first part of the project involves upgrading 3D texture painting performance, quality and tools, and will be worked on in the coming months.

The second part of the project is a new procedural and layered system to create textures, aimed at PBR shading. That part of the project will start later this year, and we are sharing the initial design now for feedback from the community.


Texture Datablock

The design revolves around a redesigned Texture datablock. This datablock contains texture nodes and a list of channels it outputs.

A typical set of channels for PBR shading is base color, roughness, metallic and normal map. The system is not limited to these; arbitrary outputs are possible for different types of BSDFs, or for use in other contexts like brushes and geometry nodes.
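As a rough illustration of the idea (the class and field names here are hypothetical, not the actual Blender data model), a texture datablock pairs a node tree with a declared list of output channels:

```python
# Hypothetical sketch of the proposed Texture datablock: a list of named,
# typed output channels alongside the texture nodes. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class TextureChannel:
    name: str          # e.g. "Base Color", "Roughness"
    socket_type: str   # "RGBA", "FLOAT", "VECTOR", ...

@dataclass
class TextureDatablock:
    name: str
    channels: list = field(default_factory=list)

    def add_channel(self, name, socket_type):
        self.channels.append(TextureChannel(name, socket_type))

# The typical PBR channel set described above:
tex = TextureDatablock("Painted Metal")
tex.add_channel("Base Color", "RGBA")
tex.add_channel("Roughness", "FLOAT")
tex.add_channel("Metallic", "FLOAT")
tex.add_channel("Normal", "VECTOR")
```

A brush texture or a geometry-nodes texture would simply declare a different channel list against the same structure.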

Texture Layers

The texture properties for a texture datablock show a texture layer stack. Procedural texture layers can be dropped in from the asset browser, and new image or attribute layers can be created to be hand painted.

Panels in the texture properties editor: layer stack, properties of the selected layer and modifier, and list of texture channels.

Layers work similarly to those in typical 2D image editing applications. Blend modes, masks, reordering, hiding, merging, and modifiers would be supported. Selecting an image texture or color attribute would enable painting on it in the 3D viewport.

A difference is that each layer consists of all the channels defined for the texture (or a subset of them). Blending, masks and modifiers affect all channels in a layer simultaneously.
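The per-layer blending described above might be sketched like this; the channel names, the dictionary representation, and the `blend_layer` helper are illustrative assumptions, not Blender internals:

```python
# One mask/opacity pair blends every channel of a layer at once.
import numpy as np

def blend_layer(base, layer, mask, opacity=1.0):
    """Mix every channel of `layer` over `base` with a shared mask."""
    factor = mask * opacity
    out = {}
    for name, below in base.items():
        above = layer.get(name, below)   # a layer may define only a subset of channels
        out[name] = below * (1.0 - factor) + above * factor
    return out

base = {"Base Color": np.array([0.8, 0.1, 0.1]), "Roughness": np.array([0.9])}
rust = {"Base Color": np.array([0.3, 0.2, 0.1]), "Roughness": np.array([0.4])}
result = blend_layer(base, rust, mask=0.5)
```

A per-pixel mask image would work the same way: the single `factor` drives the mix of base color, roughness and every other channel together.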

Texture Nodes

The texture layer stack corresponds to a node graph, and the node editor provides an alternative view to edit the same texture. This can be used to create more complex node setups that can’t be expressed in a stack.

Texture nodes corresponding to the layers in the mockup above.

The new texture nodes are mostly shared with shader nodes. New additions are layering nodes, and nodes for effects that are not practical at render time.

The set of available nodes would be:

  • Common nodes like Math, Mix, Image Texture, Noise Texture
  • Shader nodes like Geometry, Texture Coordinate, Attribute
  • Texture specific nodes like Blur, Filter
  • Layer node to bundle multiple channels together into a layer
  • Layer Stack node to combine layers
  • Texture node to link in an existing texture datablock asset

There is a new layer socket type that combines multiple channels. Nodes like Mix and Blur work on multiple socket types, including layer sockets where they perform the same operation on all channels.
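A layer socket could be thought of as a bundle of channels that nodes map over uniformly. A minimal sketch, with a hypothetical `Layer` wrapper standing in for the proposed socket type:

```python
# A node receiving a layer socket applies one operation to each bundled
# channel, as Mix or Blur would. The Layer class is a stand-in, not real API.
import numpy as np

class Layer:
    def __init__(self, **channels):
        self.channels = {k: np.asarray(v, dtype=float) for k, v in channels.items()}

    def map(self, fn):
        """Apply the same per-channel function to every channel."""
        return Layer(**{name: fn(data) for name, data in self.channels.items()})

layer = Layer(base_color=[0.2, 0.4, 0.6], roughness=[0.5])
halved = layer.map(lambda c: c * 0.5)
```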

Baking

While textures can be fully procedural, baking is an important part of this design for multiple reasons.

  • Exporting PBR textures to a game engine. There would be a workflow to quickly bake all relevant texture channels down to image textures.
  • For textures with many layers, baking is important for efficient rendering in Cycles and Eevee.
  • Some nodes, like Blur and Filter, require baking to work at all, as they cannot be implemented efficiently (or at all) in renderers.
  • Baking procedural texture layers into an image texture or color attribute to continue hand painting.
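The Blur case above comes down to evaluation order: a renderer samples a texture at one shading point at a time, while a blur needs a whole neighbourhood of texels, which only exists once the channel has been baked to an image. A toy box blur over a baked channel makes this concrete (a sketch, not Blender code):

```python
# Blur a baked 2D channel by averaging each texel's neighbourhood. This
# requires the full image, which is why Blur cannot run at render time.
import numpy as np

def box_blur(baked, radius=1):
    """Average a (2r+1)^2 neighbourhood around each texel, clamped at edges."""
    h, w = baked.shape
    out = np.zeros_like(baked)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = baked[y0:y1, x0:x1].mean()
    return out

channel = np.zeros((4, 4))
channel[1:3, 1:3] = 1.0          # a baked 4x4 tile with a bright square
blurred = box_blur(channel)
```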

There should be an easy way to bake all textures in a scene, for users as well as exporters and renderers that need them.

Materials

Using texture channels in a material is done by adding a Texture Channels node that outputs all the channels of a texture, and linking those channels to the corresponding Principled BSDF inputs. Most of the time, such node setups are automatically created as part of a default material.

Shader nodes linking texture channels to a BSDF.

The Texture Channels node has the following settings:

  • Node inputs, created in texture nodes with a Group Input node (similar to geometry nodes). This makes textures customizable with parameters, attribute names or image textures.
  • Option to evaluate the texture channels Procedural or Baked.
  • If Baked, a link to one Image datablock that channels are baked to.
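The Procedural/Baked option could be pictured as two paths to the same channel value: run the texture node graph at the shading point, or sample the linked baked image. A hedged sketch, with all names hypothetical:

```python
# Resolve one texture channel at a UV coordinate, either by evaluating a
# procedural graph or by nearest-texel lookup in a baked image.
import numpy as np

def evaluate_channel(name, uv, mode, node_graph=None, baked_images=None):
    if mode == "PROCEDURAL":
        return node_graph(name, uv)          # run texture nodes at this point
    elif mode == "BAKED":
        img = baked_images[name]             # baked image for this channel
        h, w = img.shape[:2]
        x = min(int(uv[0] * w), w - 1)
        y = min(int(uv[1] * h), h - 1)
        return img[y, x]
    raise ValueError(mode)

# Toy procedural graph: roughness as a simple horizontal gradient.
graph = lambda name, uv: uv[0]
baked = {"Roughness": np.full((8, 8), 0.5)}
r_proc = evaluate_channel("Roughness", (0.25, 0.5), "PROCEDURAL", node_graph=graph)
r_baked = evaluate_channel("Roughness", (0.25, 0.5), "BAKED", baked_images=baked)
```

Switching a material from Baked back to Procedural would then be a matter of routing lookups down the other branch, with the node graph still intact.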

Baked Images

The Image datablock would be extended to contain a set of texture channels. With multilayer OpenEXR files this comes naturally: all texture channels are saved in a single file, similar to render passes. For other file formats, there is a different filename for each texture channel, or multiple texture channels are packed together into the RGBA channels of fewer files, as is common for games.
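The RGBA packing mentioned for games often follows the occlusion/roughness/metallic ("ORM") layout; the layout and helper names below are illustrative assumptions, not part of the design:

```python
# Pack three scalar channels into the R, G, B planes of one image, and
# unpack them again. A common game-pipeline convention, shown as a sketch.
import numpy as np

def pack_orm(occlusion, roughness, metallic):
    """Pack three single-channel maps into one RGB image."""
    assert occlusion.shape == roughness.shape == metallic.shape
    return np.stack([occlusion, roughness, metallic], axis=-1)

def unpack_orm(packed):
    return packed[..., 0], packed[..., 1], packed[..., 2]

ao = np.full((4, 4), 1.0)
rough = np.full((4, 4), 0.6)
metal = np.zeros((4, 4))
orm = pack_orm(ao, rough, metal)
```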

When showing an Image datablock in the image editor, a menu lets you flip through the texture channels, exactly like render layers and passes.

Baking multiple meshes or materials into a single image becomes straightforward. Link the same Image datablock to the Texture Channels node in all materials, and Blender automatically bakes everything together, assuming the UV maps are non-overlapping.

Assets

Texture datablocks are available in the asset browser for easy dropping into the texture layer stack. Blender would ship with a library of procedural textures that can be used for quickly creating materials.

One complexity in the design is that texture datablocks are aimed at multiple use cases, including different types of materials, sculpt or texture brushes, geometry nodes, or compositing. For this we need some method of filtering textures for the current task.

In the context of a PBR workflow, materials and textures are almost the same thing. In such cases having both a list of material and texture assets seems redundant. However in Blender we need to accommodate more workflows, and so we likely can’t hide this distinction from users.


Feedback

This is an ambitious design that affects many areas of Blender. We welcome feedback and ideas to improve it. There is a topic for design discussion on devtalk.

A few questions we are thinking about:

  • Is making a single texture datablock for many uses a good idea? Or does it make the workflow too fuzzy? Is there a better alternative? If not, how would filtering relevant texture assets work exactly?
  • Are there better ways to integrate textures into materials than through a node? Is it worth having a type of material that uses a texture datablock directly and has no nodes at all, and potentially even embeds the texture nodes so there is no separate datablock?
  • With this system, users can do the same thing in both texture and shader nodes. How does a user decide to pick one or the other, and is there anything that can be done to guide this?
  • What is a good baking workflow in a scene with potentially many objects with many texture layers? In a dedicated texturing application there is a clear export step, but how can we make a good workflow in Blender where users must know to bake textures before moving on to the next object?
  • Some textures can remain procedural while others must be baked to work at all. How can we communicate this well to users? Is it a matter of having a Procedural / Baked switch on textures that can be manually controlled, or is there more to it?
  • How do we define what a modifier is in the layer stack? Conceptually this can just be any texture node that has a layer input and output, but how do we find a set of such nodes and node groups that work well in a higher level UI? Possibly some asset metadata on node groups?
  • Can we improve on the name of the Texture datablock or the new planned nodes?
  1. It’s not texture layers that are needed; that can already be done with BSDF nodes, so you’re adding nothing new. It’s material layers that are needed. For example, there’s currently no way to add a refractive shell over a diffuse material, other than adding a geometry shell with its own refractive material.

    https://bitbucket.org/Diffeomorphic/import_daz/issues/780/better-layered-images-lie
    https://bitbucket.org/Diffeomorphic/import_daz/issues/558/refractive-shells-dont-work-and-solution

    • Yes. This guy gets it. We need material layers please.

    • I don’t believe what you’re describing as Material Layers is actually a “Material Layer”. If you want a material to show the underlying material that is underneath it, physically speaking it’s another object above the one you’re actually rendering, thus requiring this geometry shell.

      This can be simulated for some of the material properties through clear coating, but as you start to accumulate layers you’re better off having the extra geometry describe that extra layer. The math behind adding these different shaders together does not effectively or efficiently produce the behavior you’re expecting it to have.

  2. Actually I take back what I said. I don’t care if these new features are familiar to Substance. Blender usually does things differently and it ends up being way better, so I trust that process. I’m just hoping for feature parity and performance parity.

  3. Just taking the phrase “everything nodes” for a minute to the next level:

    Texture based and texture related nodes can be inserted everywhere. In this context we use generators (noise and other patterns, pixel and vector shapes), loaders (for existing texture maps), and operators/modifiers like UV-mappers, selectors, blurs, math nodes, mixers etc.
    We generate and manipulate 2D images, as well as 3D textures.

    The fun starts when I create a scene node, plug some object and light nodes into it, and feed it to a render node. Its rendered output becomes part of a texture node network for an object in a different scene node with completely different lights and render settings. This lets us mix and remix 2D and 3D as needed, with as many scenes and objects needed.

    Texture nodes just become one group of many others, and they all can be used where needed.

  4. 1) There need to be many more built-in patterns. For example, Inkscape has had http://en.wikipedia.org/wiki/Wallpaper_group as a basic ability for literally decades, but Blender has only bricks, checkerboards and (much better) noise. Repeated patterns with modifications are very common.
    2) Do you need to have Blender-controlled functionality? Could you work with the Krita people so their app produces the brush engine and strokes to create something which comes into Blender? If you don’t, will you be wasting time, money and energy reproducing the paint strokes?

    • You usually cannot just copy-paste code from one piece of software to another, so you have to rebuild it anyway. Sure, you can look at how it’s implemented in Krita to get an idea of how to do it in Blender’s code.

  5. About the shader and texture nodes: maybe you can try to mix shading and texture painting.

  6. This is amazing. The ideas you’ve presented here look really good so far. Here are some other thoughts and feedback I have.
    1. Consistency, intuitiveness, and familiarity. I’d like to see this work in a way that is familiar to users of Substance and Photoshop and factor in how it would integrate with the Texture Paint workspace in a manner that is familiar to Substance Painter. This would include consistency across the whole software, such as the current Mix Node inputs being changed so the top input is the top layer, as an example.
    2. I’d like to see an ever-expanding built in set of procedural patterns/textures that can be used to build anything out of. And maybe overhaul the current Texture Properties tab, that often feels out of place, with this in mind.
    3. I would like to see the entire baking procedure reworked when these new baking features are added, perhaps included in the new overhauled Texture Properties mentioned above? The idea of randomly selecting an Image Texture node within the Shader Editor and then hopping out to the Scene Properties, changing rendering engine and baking to that randomly selected node, etc. is messy and unintuitive.
    4. Perhaps there should be a new workspace entirely dedicated to these texture generating and baking workflows? Substance Designer, Substance Sampler, or a simple but popular program among Blender users called Materialize could be inspiration for this new workspace.
    5. Baking and texturing often goes hand in hand with preparing and reworking assets for proper topology, especially for game development that you mentioned in your post. With this in mind, I’d love to see the Quad Remesh get upgraded or perhaps you could even buy the code from the popular QuadRemesher plug-in which doesn’t get updated much, but still remains crucial to many Blender users.

  7. One important thing we use in our “layer”-style shader graphs now is multiple Material Output nodes for various channel outputs (like SP’s drop-down in their 3D & 2D windows). This lets us click a “Roughness” output node in a material and preview the results of that “channel” as an emissive value. Ideally we could get SP-like channel selection, similar to how render passes are selected in the 3D window viewport settings. We use another Material Output node to quickly preview/use baked results for the entire material.

    While traditional layers seem nice, we do a lot that I feel is easier (or only possible) to do as nodes (such as mask reuse with various color ramps or custom math). SP, I think, tries to solve some of this with what they call “anchors”, but it often results in an abundance of layers (vs. maybe a sub-graph in a node-based approach).

    One thing we were hoping for initially was custom socket types (structs) to group channels (color, roughness, metallic, height, etc.) into a single noodle. This would clean up our current graphs, but we’d still need a way to split the channels apart occasionally.

    Finally, one thing our custom Layer node does when blending two types of materials together using a mask is to treat the new material layer as if it was at a specified +/- height offset with a setting to specify the height where the new material begins. Think scratches in painted metal revealing the metal. The point being that a standard, Photoshop-like blend-mode based layer system would likely not provide a way to implement this very useful custom feature.

  8. Personally I support that, but I don’t want to blend shading and PBR creation and other things. I don’t like the design; it’s easier for a beginner to have color with color.

  9. Exciting news! In regards to the talk about improving the ability to create procedural textures and materials like that of what Substance offers, having the ability to create and bake them out as seamless to be used in-game engines, etc would be incredible. Right now there is no provided way to bake out procedural textures and shaders generated from within Blender as seamless. Providing more nodes and a nicer workflow for generating seamless procedural textures like Substance Designer would be an incredible ability to do right from within blender.

    Thanks for your hard work!

    Best Regards

    Anthony P.

  10. Hey, first of all, this is very exciting and a great step in the right direction.

    I believe that there’s a separate topic that is closely related and should be tackled at the same time if this is supposed to be used and be useful for artists in the games industry.

    For us the most useful feature in software like Substance Painter is the fact that we can write custom shaders for Substance Painter in GLSL and have a (rough) representation of our game engine shaders in the viewport AND the ability to pack textures according to the requirements of the shader.

    A simple example would be a shader that has a Material Texture which packs Roughness, Metalness, AO into one texture’s different channels.

    A more complex example might be a shader where the artist only authors the normal map and a mask texture (e.g. where to blend from painted color to metal to rust) which benefits greatly from the procedural painting tools (e.g. start from AO pass, overlay a noise generator, hand paint a few spots etc.). In this case we want both a shader that replicates what we’re doing in the engine (often already exists as an HLSL/GLSL asset) and the option to output that specific mask pushed into a specified texture.

    The current implementation of the shader graph is not suitable for quickly previewing these types of assets, as it’s missing simple datatypes like float4/vector4, and re-building a shader in a different graph system is always difficult. The OSL shader node, on the other hand, only works in CPU rendering mode and not at all in real time. Ideally, a simple GLSL shader node for Eevee would allow us to quickly get game-like shaders into the viewport.

    Second, it got mentioned earlier already, but solid PSD support for texture input and output would make a big difference.

    Lastly, any pipeline would be great to be batch-able, so in some way at least expose the workflow to Python, so that we could quickly import mesh files, run a texture generator on the mesh and bake out the textures to files.

  11. I don’t care if it’s a competitor to Substance or any other program; making it as simple and functional as possible is the best way forward, and Blender is the way forward. By all means add things that allow for easy transfer, but I don’t want to see Blender over-complicate itself just to compete with the likes of Adobe or anyone else; such additions aren’t helpful, they clog up Blender and make it unwieldy. I like what is already being proposed as a way forward in this document.

  12. Please target Substance Painter as a goal. Performance-wise, it can handle a 20-million-poly asset with 20 UDIMs EASY, and a similar workflow. A mix of layer- and node-based workflows. You could implement a node that contains a painting layer system, for example. That would make the best of both worlds.

  13. It would make things extremely easy if Blender had more procedural textures in addition to cloud, wood, noise, etc. It would speed up the shading workflow. Modo, for example, has 147 built-in textures, which lets literally any surface be simulated without any external texture. For reference:
    https://learn.foundry.com/modo/901/content/help/pages/shading_lighting/emodo_textures.html

  14. I’d really want to see MaterialX being taken into consideration from the very start of this.

    • Material X is horrible.
      As an under-the-hood effort, why not, but I don’t want to see these horrible nodes in the software.

  15. Nodes suck. I can’t wait for Blender to compete with Adobe, since they tried to rip off users of this new market when they bought the license from Allegorithmic, but I’m still rocking the 2019 version for free. Fuck Adobe, I’m never going to update to their version. I will happily use Blender instead, but it’s not enough to create a powerful tool that is very hard to understand intuitively when you don’t know how to do something. It’s bad enough having to look up the UI and keyboard shortcuts every 4mn20 to use Blender, on top of regular tutorials. You can have nodes working under the hood while making the UX much better for casual users. Please don’t add the capability just for some paid plugin to do the actual job instead of Blender for free… It’s a big challenge to take on Substance Painter, and I hope you will pull it off. But make it more intuitive than Substance, not less. Blender is already bad enough as it is; even after 2.8 it is still a huge mess of elitism tricks that make no sense unless you understand every single little thing about 3D under the hood.

  16. my random ideas:
    – height map + normal map vs. vector displacement map
    https://blenderartists.org/t/can-i-generate-a-vector-displacement-map-from-normal-and-height-maps/1131781/
    – normal map vs. bent normal map
    https://blender.stackexchange.com/questions/157079/how-do-i-bake-a-bent-normal-map-in-blender
    – texture compression (DDS/KTX/BASISU) quality preview (which compression codec should I use: one correlated RGB texture, or three 1-channel textures?)
    – “mip trick” for NPR
    http://blog.mecheye.net/2018/03/deconstructing-the-water-effect-in-super-mario-sunshine/

  17. I see some similarities to my old addon (e.g. Layer Stack == Layer Manager, Material == Layer?, Baking ..) :), but would be happy to see it natively in Blender (https://blender.h0bb1t.de/mw/doc/current/#node-reference).

  18. This is a fantastic (and much needed) change!
    Would it be possible to recreate substance painter’s slope blur filter as part of this? That’s currently the biggest feature pushing me to them, but if blender could handle that natively it would save me a lot of time & money.
    Thanks!

  19. I just want that outdated texture tab in the properties editor to go away.

  20. isn’t UV mapping an integral part of texturing?
    I’d really love for Blender’s UV editor to have a one-click function to turn every UV island of distorted quads into quads with 90° angles. It would be oh so helpful and save a lot of time! Please get this on your radar.

  21. As the only texture artist who uses Blender in my studio, this is awesome news. Three things on my mind:
    1. The texture layer node design is something that I have been thinking about for quite some time. I am surprised and happy that you guys will bring my dream (although it’s still a concept) to life 🙂
    2. Baking UDIMs is still difficult at the moment, requiring either broken addons or baking one by one with workarounds. I also hope Blender will have an option to choose the UDIM numbering separator, underscore (_) or dot (.), for example body_1001.png or body.1001.png, since different studios have different pipelines for handling UDIMs. My studio is using underscores at the moment.
    3. Last and most important is the ability to export layered bitmap files more easily. Texturing departments sometimes have to exchange data between artists using different texturing applications, and the only format we use is .psd. At the moment I use Krita to import each layer one by one, adjust the layer opacity and blending mode manually, and export to .psd. I wish we used more of an open-source format like .ora, but sadly that’s just the way it is, and for the love of Blender I keep doing it no matter how tiresome the process.

  22. 1) Being able to use the same texture for a material, geometry deformation or particle emission is a must-have. It is also a must-have for camera-projected textures, compositing and stencils for painting. There is probably a need to use similar textures for GP brushes and materials of GP strokes generated by the Line Art modifier.
    Similar uses in different contexts often imply similar mapping.
    I would distinguish 2D/3D/4D textures.
    2) Embedded textures simplify the workflow. The current textures in shader nodes are appreciated because users don’t have to manipulate several datablocks with confusing names. Many people would like an image texture without having to deal with an Image datablock. The counterpart is that without a texture datablock, textures cannot be reused elsewhere.
    3) I think that there should be a way to convert a group of textures embedded in material or brush (nodes datablock) into datablocks reusable elsewhere (nodegroup/nodetree datablocks).
    4) Users are not interested in baking textures. They are interested in baking channels.
    Baking can be used in conversion of a texture from one mapping to another one.
    But that is not the main goal of baking. It is not the number of textures used in the scene’s materials that defines the number of desired baking outputs.
    User may want to bake shadows, reflections, GI of a scene that has absolutely no textures.
    The correct way is not to make the user bake textures for one object and then pass to another one.
    There should be a collector of user requests where the user defines what they want for a selection or a single object. Then, Blender should allow them to launch a sequence of baking processes.
    5) Probably a warning in a node label, similar to what we have for GN, should be sufficient. And if a simple button, not far from the warning, allows solving that quickly, that is perfect.
    6) Again, solution is probably similar to the ones used in GN to distinguish curve from mesh.
    7) I think that terms like texture nodegroups or texture nodetrees could help to distinguish them from texture nodes.

    • In point (4) you say users are not interested in baking textures. This isn’t true; maybe for you it is, but for many of us it’s the only way we can use Blender: getting textures, image textures, procedural textures, into the gaming environment via a texture bake.

  23. For baking normals from High Poly to Low Poly, one of the most frequent actions is correcting distortions on details.

    At the moment, this is implemented conveniently only in the Marmoset Toolbag, tools “Paint Skew” and “Paint Offset”.

    Are you planning to add similar tools to Blender?

  24. Please can you make this work with vertex color layers as well?
    It would be great if you could use all the texture combine modes with vertex color layers too, then flatten them down to a texture or to another vertex color layer (texture-to-vcols baking)!

  25. I’m wondering why I would need two graph editors for this, I’d rather add the extra nodes to the existing shader editor if possible.

    It’s easier if I just mention some of the problems I ran into with the current texturing workflow.

    Having to create an image texture, save it and wire it up so that I can use it, every time I want to add a channel or layer with an RGB Mix node.
    Not too big a deal, but it requires more steps than in other apps.

    Not being able to change the resolution nondestructively to improve performance.
    I ran into an unworkable situation with lag while working on 4k textures with UDIMs, and ended up having to switch apps to finish the job.

    It would be amazing if somehow you were able to duplicate the way some apps are able to get around this memory issue, because an awesome node system isn’t worth much if you can’t use it due to lag.

    Baking with UDIMs, I was unable to collapse the nodes to a flat image, it just didn’t bake for some reason.
    I was trying to bake my nodes in an effort to improve performance.

    Color picking is a problem sometimes, because it picks the shading on the screen when I want the actual color from the flat texture. I have to switch to flat shaded mode, color pick, then switch back to shaded mode.

    Other things that require jumping to another app.
    Not having the ability to warp a stencil before projecting a texture with it.
    Height painting causes banding no matter the texture size or bit depth, makes it unusable for me.

    Image filters I tend to use = color correction, levels, blur

    UDIM order was a problem, when different parts of the model are on UDIMs not starting from 1001, and I just want to work on a specific section of the model.
    If I have a part of the model that is starting from UDIM 1010 to 1014. I found that I had to add a bunch of small blank textures to fill the gap between 1001 to 1009, or Blender wouldn’t read the UDIMs.

    Baking UDIMs should combine shared UV tiles, if you have UVs from different parts of the model on different shaders. This is a problem with SP that I have to fix manually after baking.

    Any texture, UDIMs for example, should refresh automatically if changes are made outside of Blender.

    ———————— QA

    Is making a single texture datablock for many uses a good idea? Or does it make the workflow too fuzzy? Is there a better alternative? If not, how would filtering relevant texture assets work exactly?
    Not sure, don’t know enough to answer. I generally like it if things go back to a simple source.

    Are there better ways to integrate textures into materials than through a node? Is it worth having a type of material that uses a texture datablock directly and has no nodes at all, and potentially even embeds the texture nodes so there is no separate datablock?
    I like the nodes because I’m used to it, but maybe have both options and we’ll know better after using it.

    With this system, the users can now do the same thing both in texture and shader nodes. How does a user decide to pick one or the other, is there anything that can be done to guide this?
    If you can do the same in both, why separate it? Make it all the same shader graph, or easy access sub graph.

    What is a good baking workflow in a scene with potentially many objects with many texture layers? In a dedicated texturing application there is a clear export step, but how can we make a good workflow in Blender where users must know to bake textures before moving on to the next object?

    Sometimes I wish that I could bake the node tree to a flat texture right in the Shader graph, as easily as ticking an arrow to collapse an options panel. So a bake node maybe would be nice for fine control, or if you want to bake all channels, then have an option in the N panel.

    If there are multiple objects using the same UDIM space, it would display a list where you can check the boxes for the UDIM tiles that should be combined in this image. The node tree would still be there, after baking but you can choose to display the flattened texture or enable child nodes if you still need to make some changes.

    Some textures can remain procedural while others must be baked to work at all. How can we communicate this well to users? Is it a matter of having a Procedural / Baked switch on textures that can be manually controlled, or is there more to it?
    Manual control sounds fine.

    How do we define what a modifier is in the layer stack?
    You mean visually? Just another layer, sub layer, maybe indented or with a different color, shape, icon.

    Conceptually this can just be any texture node that has a layer input and output, but how do we find a set of such nodes and node groups that work well in a higher level UI? Possibly some asset metadata on node groups?
    How to work them? I guess if you have a layer selected, you can see its modifiers listed in the side main panel, just like the mesh modifier tab.

    Can we improve on the name of the Texture datablock or the new planned nodes?
    Texture datablock sounds fine.

  26. Amazing to see this finally being tackled, can’t wait !

    For large amounts of objects to be baked, there should be some way to “select” all those objects, save / edit the selection, and batch bake the whole selection. A good way to deal with this selection would be to link the objects to a collection, and simply bake recursively from this collection.

    Also, I think there should be a summary editor to see at a glance which nodes/textures should be baked, autobaked, etc., and filter them by collection/objects/node trees. A good candidate for this could be the spreadsheet editor; it would allow quickly interacting with checkboxes, and perhaps even text fields to set output paths for the textures.

  27. Are there plans to implement texture projection painting? This would place the texture on the camera plane, and offer the user the ability to paint the texture onto the geometry. Expected parameters: repeat texture, clip texture, scale etc.

    This is the first thing I would look for in a texture painting system.

    • Isn’t this already a thing with stencil mapping for brush textures? Or do you mean something different?

      • Currently this only works on one texture (channel) at a time, to my knowledge. Painting PBR decals needs to be possible (i.e. painting the color, normal, height, roughness and metallic channels all together when painting stencils/decals).

  28. “With this system, the users can now do the same thing both in texture and shader nodes. How does a user decide to pick one or the other, is there anything that can be done to guide this?”

    As for this one, there are two approaches for this.

    1) Recycle the useless texture editor. Make it the dedicated tool for texture creation, with its own N panel, T panel, etc. to manage inputs, outputs, parameters, etc.

    2) Make textures node groups, so we would have an “Image Texture” and a “Procedural Texture” node you place into your shader graph, tab in, do stuff, and a new special node group is created that you can link/append anywhere in Blender as a texture.

    And if possible, I’d combine both of these: keep the texture editor, and when you tab into the node group you just open an instance of that mode. This way the user can decide how to work; if we cannot access the node group’s internals from the shader graph, things would be slow.

    So if only one can exist, I absolutely would go for 2)

    • A mix of the two would solve two problems:
      – Having to switch between two editors while working on a single step (shading) doesn’t sound like a good workflow.
      – There still needs to be a dedicated texture editor, for brushes for example.

      IMHO the best workflow would be to tab into a “texture node group”, which would invoke a new editor with the same parameters as the shading editor (for open/closed N and T panels). This would allow a dedicated preferences panel for texture editing, while still keeping this specific editor available for other use cases.

  29. I really like the idea of materials being both able to be procedural and baked. The design is very good.

  30. I really wish animated brush support were there (an image sequence used randomly as a brush texture, something like GIMP’s animated brush (.gih)).

  31. This sounds fantastic and is something that I cannot wait to happen. It really is a powerful combination of proposed updates: texturing updates to Blender to make things easier to get started, and improved performance for us folks who have used it for a long while now. Upgrading the texturing side would really allow my entire asset pipeline to easily stay within Blender and make it substantially easier to teach/introduce such a pipeline to others. So happy to have texturing updates being discussed and on the horizon, and I’m eagerly awaiting what may come of it!

  32. Holy Moly.. This is one hell of a development. Kudos to the team making this happen. This is so needed at the moment.

    Really excited to use this.🤤

  33. I just hope the UDIM system gets treated more nicely this time. Currently, if you stack alpha textures from a single UDIM tile you have to input every other UDIM tile so Blender will fill the spaces correctly. That’s not how other DCCs work, as they will “know” to ignore missing UDIM tiles.

  34. YES YES YES
    YES YES
    YES F*** YES F*** YES
    F***!

    Take my energy, go forth, make this happen, now! Take my money, donating right now, just, all my yes to everything here.

    Can nail down the details along the way but the general direction of this is perfect. As someone who creates largely speaking real time 3D model assets, and who has had to suffer through the current baking system for many years while waiting for any kind of improvement to it, this is everything I want.

    • Can I cry? This is so amazing and relieving. I truly feel the happiness.
      I post it here because they are looking for constructive feedback, but I can’t help but say it.

  35. “A difference is that each layer consists of all the channels defined for the texture (or a subset). Blending, masks and modifiers simultaneously affect all channels in a layer.”

    Wanted to comment on this specific point. One of the strengths of Painter is that while this is true, you can still separate the behaviour per channel, as in changing the blending of the roughness map, or disabling/replacing the normal map, setting opacity per channel, etc.

    If Blender cannot do this, the power of the layer stack will be quite limited. Seeing that the mockup already allows selecting channels, it would be necessary to store whatever the user does in every channel and process things accordingly.
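    The per-channel override idea described above can be sketched as a small data structure: a layer carries layer-wide defaults, and individual channels can override blend mode, opacity, or visibility. All names here are hypothetical illustration, not Blender API.

```python
# Sketch: a texture layer whose blending can be overridden per channel,
# as described for Painter above. All names are hypothetical.
DEFAULTS = {"blend": "mix", "opacity": 1.0, "enabled": True}

class Layer:
    def __init__(self, channels):
        self.channels = channels   # e.g. {"color": ..., "roughness": ...}
        self.overrides = {}        # per-channel setting overrides

    def setting(self, channel, key):
        # A per-channel override wins; otherwise the layer-wide default.
        return self.overrides.get(channel, {}).get(key, DEFAULTS[key])

layer = Layer({"color": (1, 0, 0), "roughness": 0.4, "normal": None})
layer.overrides["roughness"] = {"blend": "multiply", "opacity": 0.5}
layer.overrides["normal"] = {"enabled": False}  # mute just the normal channel

layer.setting("color", "blend")      # layer default: "mix"
layer.setting("roughness", "blend")  # overridden: "multiply"
layer.setting("normal", "enabled")   # overridden: False
```

    The point of the sketch is that the layer stack stays simple in the common case (one blend mode for all channels) while still storing what the user does per channel.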

  36. I’m an occasional Blender hobbyist. Finding that Blender cannot easily copy selected shader nodes from one Blender application instance (Windows process) to another (with Ctrl+C, Ctrl+V) was pretty disappointing. I saw a ticket explaining that it’s not a bug but a technology limitation. Hopefully this limitation will be overcome in the new system.

  37. Maybe trying to combine a live procedural texture system with an offline raster based texture system will be detrimental to the functionality of each. Would it be amazing? Absolutely, but I’m not sure I’ve seen a system like that before. Maybe there should be something like a Texture Creator (offline raster based), and the live system can be built on top of shader/geometry nodes.

    Not sure how Substance makes their graph files (.sbsar) editable, but I suspect it bakes to a texture behind the scenes after recalculating changes. Blender could use a new datablock to store “offline” texture graphs that recalculate whenever used somewhere (or are baked using methods others have described).

    You probably couldn’t animate such a texture efficiently either. Those sorts of things would be left to the live system in shader/geometry nodes.

    If I sound completely off here just ignore me, I’m probably out of my depth.

    • Well, if you reread the article carefully, the whole thing talks about the separation of the shader system with the texture system. Things like the blur node will only work in one area.

      It’s literally the whole point of it; it remains to be seen whether there would be a separate texture editor, or whether textures would simply be node groups that change the mode when you enter them.

      In one mockup pic they show the options “procedural/baked”: “baked” meaning the node has a baked result for that particular texture, and “procedural” allowing animation of parameters, calculated on the fly.

      And since the whole thing revolves around creating a “texture datablock”, it’s easy to see that datablock will contain baked textures as well. Textures that should be accessible for other areas of Blender, like Geometry Nodes or the compositor.

  38. My answers to the questions:
    1) You could have 2D texture datablocks be in the asset editor. Modifying one could create a new texture datablock.
    2) I think nodes are pretty good, but texture layers also have benefits. What about having a texture layer tab that automatically creates nodes when you create texture layers? That way you retain the versatility of nodes & benefit from the speed & ease of use of texture layers.
    3) I really don’t know.
    4) Nodes that require baking could have a bake button on them & a warning pinned to the node saying, “must be baked to take effect.” Also, the shader tab could have a “bake all nodes” button.
    5) A warning on the nodes requiring baking & in the shader tab saying, “certain nodes need to be baked to take effect.” A switch alone might not be enough. I think you need a clear warning when it’s not baked.
    6) You could use node groups to define layers. Each node group would have a layer input/output, then you could do higher level node work inside.
    7) I honestly love the new ideas you guys have presented & am very happy with Blender texturing as it is. These new texture layers will make it even better!

  39. Personally, I prefer working with nodes, so I really like the idea of having the option to get more complex using nodes. A layer system isn’t the best for procedural effects in my opinion. The only thing that I don’t like about the current system is having to create a texture for each “layer” of texturing I make, and also making sure to save that texture when I close the file. Having a dedicated texture painting node system sounds fantastic, and I hope it overcomes that little issue!
    Mari’s node system has perfected this from what I’ve seen, making the texture creation process as transparent as adding a new layer in a normal layer system. On top of that, they have a cache node that essentially bakes whatever is connected to it, so that it doesn’t need to be recalculated all the time. I think that would be a great addition to this new node system.

  40. Looks very useful! Here’s hoping that this will pave the way for Texture Painting to paint values into multiple channels simultaneously.

    • ^ This! At least for painting decals! Not being able to easily add PBR (multi-channel) decals is keeping us on Substance.

  41. This could be useful for compositing/vfx too. You could roto and mask out footage without going to compositing tab.

  42. Good design.

    One of the strengths of SP is that apart from the materials and the procedural masks, the brushstrokes are also procedural. This allows you to paint a text at a low resolution and later raise the quality.

    I imagine that SP saves all the paths of the brush strokes and attributes such as the color or radius of the pointer among others. So at any time you can reproject the colors or patterned textures.

    In my opinion it would be something similar to Grease Pencil being able to paint the objects around it. SP does not allow modifications to the mesh, so strokes cannot be replayed on a changed surface. Blender does allow you to modify the mesh, so I think that, adapted to nodes, it should be done through a utility that takes a stroke path and projects it onto a baking surface to convert it to a texture.

    In any case, seen in this way, it could be added in the future so that the process becomes completely parametric.

    Another very useful tool is the projection boxes, useful for stamping stickers by bringing the boxes closer to the surface of the 3D model.

  43. The biggest problem I find now with procedural textures is when they are used as masks for transparency, which causes a significant performance drop. For example, if I make a forest where all the leaves are created with procedural textures, animations in Blender play back at 5 fps (and those fps drop further as the view gradually approaches the leaves, because it seems to readjust the level of detail, as it is effectively a vector image). But if I bake only the part of the node tree that generates the mask, playback goes up to 15 fps. Obviously, once the transparency mask is baked I can no longer change the shape of the leaves, but at least I can still modify the details inside the leaf’s shape. In other words, without baking the whole material, a considerable performance difference is noticeable (300%).
    What I describe is just an example of something that happens to me sometimes, but it leads me to the following reasoning: is it necessary for procedural textures to be calculated all the time as vector images? Perhaps each texture could have an option to limit it to a fixed resolution in pixels, or there could be a separate node that converts the result of a chain of procedural texture nodes to pixels (similar to how geometry nodes convert instances to real geometry with the “Realize Instances” node). At least that would prevent the procedural texture from consuming so many resources every time one zooms in or out. And once it becomes a static texture in pixels, I imagine it would also be possible to have an official blur node for textures. I’m not saying this from a technical background; I don’t know how Blender works internally, so what I’m proposing may be wrong. But from experience I know I’m not so far off either, so even if only by coincidence, this is my contribution.
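    The “convert a procedural chain to pixels” idea amounts to evaluating the procedural function once into a fixed-size grid, then sampling that grid at shading time instead of re-evaluating the whole node chain. A minimal sketch, with a made-up stand-in for the expensive mask function:

```python
import math

def procedural_mask(u, v):
    # Hypothetical stand-in for an expensive procedural node chain
    # producing a leaf-like transparency mask (values 0.0 or 1.0).
    return 1.0 if math.sin(u * 40.0) * math.cos(v * 40.0) > 0.2 else 0.0

def rasterize(fn, size):
    """Evaluate a procedural function once into a fixed pixel grid."""
    return [[fn((x + 0.5) / size, (y + 0.5) / size)
             for x in range(size)] for y in range(size)]

def sample(grid, u, v):
    """Nearest-neighbour lookup, replacing per-shade re-evaluation."""
    size = len(grid)
    x = min(int(u * size), size - 1)
    y = min(int(v * size), size - 1)
    return grid[y][x]

baked = rasterize(procedural_mask, 256)
# After baking, shading only pays for a table lookup:
value = sample(baked, 0.3, 0.7)
```

    The trade-off is exactly the one described above: the baked grid is cheap to sample but fixed, so changing the mask’s shape requires re-baking.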

    • “Perhaps each texture could have an option to limit it to a fixed resolution in pixels, or a separate node that converts the result of a chain of procedural texture nodes to pixels”

      I guess this could be used to also have the procedural textures generate different LODs (even automatically perhaps in combination with some node to specify viewing distances)…

  44. It would be handy to integrate textures in general in the asset browser, so you can design your own texture-patterns and use them later or share them with others.

  45. I have a suggestion for baking: there could be a node which bakes its input to a texture. It could have parameters like size, image type, output file, sampling, etc., a checkbox to auto-bake (triggered if the input node graph changes), and a button to bake manually. The output is color data which can be plugged into BSDF inputs.

    • Furthermore, for mesh bakes (normal, AO, cavity, etc.) it should be added to the N panel, so the mesh to which the material is applied is used, and a high-poly mesh for it can be added there. An automatic system based on names (e.g. _hi and _lo) could also be used here.
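    The auto-bake trigger suggested above (re-bake only when the input node graph changes) can be sketched as a content-hash cache: hash a serialized description of the upstream graph and re-run the expensive bake only when the hash differs. Everything here is hypothetical illustration, not an existing Blender API.

```python
import hashlib

class BakeNode:
    """Sketch of a bake node that re-bakes only when its upstream
    graph changes. Names and structure are hypothetical."""
    def __init__(self, size=1024, auto_bake=True):
        self.size = size
        self.auto_bake = auto_bake
        self._input_hash = None
        self._baked = None

    def evaluate(self, graph_desc, render_fn):
        # Hash a serialized description of the upstream node graph.
        h = hashlib.sha256(graph_desc.encode()).hexdigest()
        if self._baked is None or (self.auto_bake and h != self._input_hash):
            self._baked = render_fn(self.size)  # the expensive bake
            self._input_hash = h
        return self._baked

bakes = []
node = BakeNode(size=512)
render = lambda s: bakes.append(s) or f"texture{s}"
node.evaluate("noise->colorramp", render)  # first use: bakes
node.evaluate("noise->colorramp", render)  # unchanged graph: cached
node.evaluate("noise->blur", render)       # graph changed: re-bakes
```

    A real implementation would hash node parameters and links rather than a string, but the invalidation logic is the same.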

  46. This is an excellent proposal and I’m glad to see Blender’s texture subsystem getting attention. I had a thought regarding this question:

    “With this system, the users can now do the same thing both in texture and shader nodes. How does a user decide to pick one or the other, is there anything that can be done to guide this?”

    My suggestion would be, instead of creating a new texture node editor with its own workspace etc., why not make this an extension of the shader/geometry node editors? A user would add texture nodes to a node tree as they already do, and can then Tab into the texture nodes (like we do with node groups) to edit the texture’s own node tree.

    There could be a button on the node that accomplishes the same as hitting Tab (so it’s not a completely invisible feature), and also a Bake button for nodes that require it.

    I think this makes the overall interface and usage more intuitive and reduces need to jump back and forth between editors/workspaces.

    • I agree editing the texture nodes inside shader or geometry nodes should just be a Tab away.

      There are other use cases like brush textures or assigning textures in a modifier stack, for which a dedicated texture node editor seems needed still.

      • Why not add an “output to texture” node inside shader nodes? Like the viewer and composite nodes in the compositor, or the material output in shader nodes.

        • That would imply creating a material unused by any object in the scene just to store node trees for brush textures.

    • Nice idea. It reminds me of mesh editing.

  47. In the Layering of the Textures inside the Texture Block – I see the intention is to provide a way to mask between each, but I don’t see a way to mask the actual Blur node affecting the texture block there except to duplicate a texture and mask between?

    Procedural textures are needing a way to drive the results similar to Nodevember’s results, much more than just adding a Voronoi and feeding a Noise node into it – so I wonder if there will be a way to work with the procedurals inside a group node that allows complex tree work and then can be brought in and added to a texture block.

    Brushes and brush masks are a strength in Blender when using procedurals – but the limit has always been that you need to set one on the brush and one on the mask, and possibly you can then also use a Mask Tool layer set to UV coordinates with a Cavity Mask turned on. IF we got a more robust brush system derived from the OLD texture node engine with procedurals AND dedicated image texture layers inside that, we could do some serious work on making better, more physically real brushes and still have better painting.

    I applaud the direction this is going – handling this manually has been a chore for some time – but I do think that a Node Tree background is essential to allowing total freedom in design here – going backward into a generic 2d editor layer stack would defeat the purpose of this, and we already have Gimp.

    I would love to see the Texture Image Block be something that would allow the user to do a bunch of work inside there, and then in the Shader Tree one could add a ‘Texture Block” node that would have a tick box to ‘render’ the contents for use in the Shader, and unchecking the box would mute that node while rework or other painting and filtering was going on that cannot be supported in the Shader as yet. Finish the work on Texture Block and then tick that box again, boom, you have the complex result inside the shader without recalculating at every movement of the viewport.

    I will be watching with anticipation 😀
    Craigo

    • How to mask a Blur node is a good question, I’m actually not sure what the best way would be. The simplest would for all modifier nodes to have an input for that purpose. If it’s a general feature of the layer stack, it’s not so obvious to me what the corresponding texture nodes should look like.

      Texture nodes should definitely support node groups. In some ways a texture datablock would effectively be a node group already, it’s not clear to me yet if we even need a distinction.

      For brushes, having the full texture nodes available would indeed be powerful. Making that efficient will be a challenge though.

  48. Hello Brecht,
    Nice design ❤️, although it is a little overwhelming.
    It is not clear if a channel contains the layers or the other way around.
    In my opinion,
    Baking should be automatic (at the material level) by caching, or frozen to disk. Also, IMHO there should be an abstracted caching system that could be extended to work for animation, both in texturing and simulation.
    Thanks for your work
    Kind regards
    N.M.

  49. If there’s going to be a common subset of nodes that works across all tree types, are textures the best interface?
    Will texture nodes end up being used as a workaround for non-texturing use cases?
    Could it make more sense to have “function” nodes?
    If some nodes (like blur) split the workflow and the evaluation into 2 different paths, do they belong to the same interface?
    Will there be overlapping functionality between texture and compositing nodes?
    Will there be non-overlapping functionality between texture and compositing nodes?

  50. A wide and diverse set of really good procedural textures that genuinely animate with time (i.e. 4D) is the essential building block of procedural texturing. Being able to properly animate (not just move location) is essential (and Cinema 4D’s ability to loop over a set period is genius), so while you look at procedural texturing in general for the next versions, please do bear this in mind! 🙂 Thanks. Regarding layering, I think others will have more experience than me, so I will look on…
