Nodes Modifier Part I: Sprites

This initial milestone of the Everything Nodes Project will focus on the features that can be used by the Blender Studio. Now that the Sprite Fright short film has been officially announced, the use cases can finally be discussed publicly.

The Everything Nodes Project started in 2019 with the Particle Nodes Project. The focus at the time was dynamic particle simulation. Around August it was decided to put the project on hold to make sure the design was on point. This was followed by a particle workshop in September, where the groundwork for the design was laid.

Moving forward, the focus shifted to tangible use cases that could be validated in production: geometry nodes, and more specifically particle scattering for set dressing. These were the features the Blender Studio artists were looking forward to the most.


The geometry nodes follow the design of the particle nodes project, but their focus is narrower: they address only static particles.

The underlying design is still the same as for the particle system, with a well-defined integration in the modifier stack as well as a clear dataflow.

The modifier’s input and output are directly connected to the non-node-based modifiers. The modifier also contains a node group that can be shared across different objects and Blender files.
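The shared-node-group idea can be pictured with a tiny Python sketch. To be clear, this is not Blender's actual API: `scatter_points`, `NodesModifier`, and `density` are invented names used purely to illustrate one node group being referenced from two objects' modifier stacks, each supplying its own input values.

```python
# Hypothetical sketch (NOT Blender's API): a node group acts as a reusable
# function over geometry, and each object's modifier stack references the
# same group while feeding it its own inputs.

def scatter_points(vertex_count, density):
    """Stand-in for a shared node group: turns input geometry into points."""
    return vertex_count * density

class NodesModifier:
    """A slot in an object's modifier stack that references a node group."""
    def __init__(self, node_group, **inputs):
        self.node_group = node_group  # the same group can be shared by many objects
        self.inputs = inputs          # per-object input values

    def evaluate(self, vertex_count):
        # The output feeds straight into the next (non-node-based) modifier.
        return self.node_group(vertex_count, **self.inputs)

# Two objects share one node group but scatter at different densities.
rocks = NodesModifier(scatter_points, density=3)
cliff = NodesModifier(scatter_points, density=10)
print(rocks.evaluate(100))  # 300
print(cliff.evaluate(100))  # 1000
```

Editing the shared group changes the result for every object that references it, while the per-modifier inputs stay local — which is the separation between high-level settings and internal building blocks described below.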

Sprints and Agile

The team is working as a squad following the Scrum methodology. The project is divided into two-week periods (sprints) in which the team aims to achieve tangible results.

The initial sprint, October 19 to 30, was a mix of preparing the groundwork and organizing the backlog for the upcoming sprints.

Sample file prepared to validate the pebbles scattering use case

The second sprint aims at finishing the first use case (pebble scattering) and having it fully integrated into master for the upcoming Blender 2.92.

The third sprint will focus on finishing set dressing for the Sprite project. It includes tree and flower scattering, as well as procedural modeling for tree bark and moss (pending node design).

The team is still adjusting its planning to its delivery capacity, so there is a chance that procedural modeling will happen in a separate sprint.

Plan of Attack and Backlog

There will be a constant balance between polishing and developing new features, depending on the feedback received through the iterations.

The backlog is organized to try to match the immediate needs of Sprite Fright:

  • Pebbles scattering – set dressing
  • Flowers and trees scattering – set dressing
  • Tree bark – procedural modeling
  • Moss on tree – procedural modeling
  • Campfire – mantaflow integration
  • Hair spray – dynamic particles
  • Salt – dynamic particles
  • Snails crawling – dynamic particles

After this is out of the way, there are other features that would have helped previous open movies.

The team’s goal is to cover the basics for well-defined use cases and hand the work back to the community.

With a well-defined design and the core functionality guaranteed to be solid, the community should be able to bring the feature-set to completion.

Meet the Team

The team consists of:

  • Dalai Felinto – Product Owner
  • Hans Goudey – Developer
  • Jacques Lucke – Lead Developer
  • Jeroen Bakker – Scrum Master
  • Pablo Vazquez – Product Design and Testing
  • Simon Thommes – Feature Requirements and Demos

Community Involvement

The work is organized publicly and the team welcomes anyone interested in contributing to development. This has already started: contributor Léo Depoix implemented the subdivision surface node.

The sprint backlog is confirmed at the beginning of each sprint and can be seen on the workboard.

Simplified view of the geometry nodes workboard.

Tasks that are critical to the sprint are handled by the team. However, there are always plenty of “good to have” features that are not prioritized by the team, such as extra nodes and refactoring.

Follow the discussions on the geometry-nodes-squad channel (read-only). Contact the team members to find out what you can help with.

Updates on this project will be communicated via the bf-committers mailing list. To help test the features, join the discussion on devtalk.

  1. First of all, I just want to take a moment to say thanks for choosing nodes as the future of Blender. This is an absolute game changer. I’ve already seen industry professionals doing really advanced stuff and switching to Blender, and for that, congratulations: the future is way bigger than most people can imagine! The devs are clearly wizards. Now it’s just a matter of polishing the features and making them more stable and robust, so artists can push Blender as far as they want with the new tools. Nodes and proceduralism are the biggest reason, along with the flexibility, why the other software that starts with H is taking over the industry, to the point that even animations are done with nodes. But Blender already has a few legs up because it has very robust traditional 3D tools. I’ve talked with some senior 3D artists who have been using 3D software for over 20 years, and the response was pretty much: when shading, modeling, and physics all get nodes and are well connected, it will change everything. So congratulations for that in advance. It’s been less than a year and I’ve already seen some pretty nuts stuff made by people who weren’t using Blender before but are now hooked on it.

  2. This does sound very good! But I’m wondering: how will the new scatter system differ from what is currently available with instances?

    • Better control of orientation, different distribution methods, and overall a more flexible system, with a better separation between the high-level settings that users need to worry about and the internal building blocks.

  3. So is it finally possible to load a point cloud Alembic with an orient attribute and scatter objects on that point cloud using the orient rotation? And pscale?

    • It will depend on how the Alembic importer handles custom data, but in theory it should work. Could you send me one of those example Alembic files for me to test? (dalai at blender dot org)

  4. Exciting indeed. Looking forward to it.

    This seems like a place in which we can get our voice heard by the devs.

    Can we start with more basic things, such as…

    – Vertex group support for mirror modifiers [not for UV mapping but for the actual mesh] (there are many cases in which users don’t want to mirror the ENTIRE mesh).

    – More work on the physics solvers for cloth and soft bodies, which tend to “explode” under a lot of settings as well as get Blender into a state of non-responsiveness [even for low-polygon meshes on high-end CPUs (high-end CPUs for clients, not servers)].

    – Handling of small-scale calculations (objects with small dimensions) with rigid bodies without resorting to changing the actual measurement unit scale.

    – Extending custom property usage within Blender without defaulting to Python.

    – Having the ability to copy “color data as driver”.

    – Supporting True/False nodes as values in the node editor without plugins.

    All that said, I do enjoy Blender and the developers are doing great work. I am thankful; keep it up.

  5. Super exciting. The parametrization of Blender is the next big evolution.
    Maybe some of the work on Sverchok could be looked at to see how it fits within this effort?

  6. “Random Rotation” shouldn’t be a node. Ideally you’d have a rotation node and a random node. That approach is far more scalable, and you can make a macro of the two if you really want it.

    • The image you see is actually showing exactly that: a few building blocks (nodes) that together make up the “Random Rotation” node (a group node or some kind of node collection used to group sockets together to organize the UI).

      • Sorry Dalai, If I’d taken the time to more than glance I’d have seen this. Thank you for taking the time to correct me.

  7. Are there any projections for when Blender will become everything-nodes complete?

    Also, are there any projections for when Blender will be ported to Vulkan and become graphics-API agnostic?

    I think these would be necessary milestones (modernising the code base) to let more people start contributing to the project.

  8. I would like to make a suggestion regarding the point cloud/instancing workflow.
    The point cloud seems to support arbitrary per-point attributes. This should make it possible to have per-point loc/rot/scale information for the instances. The modifier with the point instance node, however, seems to create one mesh (or type). The missing attribute values would have to come from inside the node tree before the instancing, which is also OK.

    My question is whether it might also be possible to have more generic outputs of the node group, like Vector/VectorArray, Float/FloatArray, Matrix/MatrixArray, which can drive attributes/properties directly, similar to drivers.
    For example, we could have a node tree with multiple output nodes with arbitrary data, and in the interface we could right-click > use tree.outputnode.attribute to set a value.

    • Interesting concept, especially when combined with shaders. But I would probably first try to have a design where the attributes are created ahead of time and edited in the node tree. That way things are more transparent to the user.

  9. Could this mean that NURBS modeling operations are about to get off the ground?

  10. OMG this is the most exciting Blender feature I’ve heard about in years. It’s going to be huge for environments. And the fact that it’s being developed for Sprite Fright gives me hope that it’ll be deployed. Can’t wait.

  11. If you need more inspiration and ideas, take a look at Clarisse IFX by Isotropix. They have a free personal learning edition.

    Beyond what Houdini can do with nodes, Clarisse is made so it can handle extremely large scenes and allows scene organization in pretty much any way you want, much like organizing files in folders. It does not have as many rules and limitations as Blender does in its outliner, so there is much less “why can’t I do this or that”.

    You are free to mix shaders, textures, geometry, generators, modifiers, renderers, image outputs, script nodes, etc. in the scene graph and group (contextualize) them as you see fit.

    The scene can both be presented in a Blender-like outliner, as node graphs, in a searchable table or even as programmable rule lists, so you can comfortably work with scenes with thousands of nodes, while focusing only on the relevant parts.

    Exporting parts of a scene into separate projects and importing those projects into new projects is handled very elegantly.

    Clarisse feels a lot like the programmers wanted to allow you to make a scene with the Grand Canyon, and you should be able to, unhindered, focus on a single pebble, among billions, in the scene.

    • Clarisse’s instance handling is fantastic and its reference system is perfect for procedural changes and organization, but its interface is horrendous and a very bad example of an artist tool: over-parametrization and what feels like endless clicks and windows just to create a basic scene.

  12. Blog post updated with a link to a new thread on devtalk.

  13. Hi! Nodes are a long hoped-for project that has finally got off the ground.
    I think that, through nodes, Blender can be used for motion design. Scattering many objects is one of the main features in mograph.
    And when node particles and some animation nodes, like the AN add-on, are ready, the real madness will begin :)

  14. Hey, wait a minute! Shouldn’t an EditMesh (modifier) node be the first and most basic node to get working on this wonderful project? I’m just looking for a way out of 3ds Max, and an EditMesh node is my ticket out! :)

    great work guys

    • Hi, in the official Blender channels there is only discussion of designs and solutions that are either agnostic, part of the Blender design, or from GPL-compatible software.

  15. Thanks for the heads-up, Dalai. Even with just the handful of available nodes I was able to do this

  16. I thought we were getting particle nodes first; very disappointed.

  17. This is looking really awesome! Geometry nodes as node groups brings a lot of flexibility and improves workflows that are currently tedious or clunky, like sharing modifier parameters across different objects. Combined with the asset manager it may bring an easy way to make and share custom modifiers.
    Loving it.

  18. Amazing news!
    This kind of content could greatly use some textures, but the texture editor is still completely broken. Wouldn’t it be wiser to work on/fix the texture data type first?


    • While I agree, particle nodes have been kicked down the road for years. It’s high time it got a proper focus.

  19. Fantastic! This plan washes away most of the worries I had when I first heard “pebble scattering”. It covers a lot of use cases, and different data types as well (moss as hair, tree bark as geometry, the campfire as a volume, snails as dynamic particles, pebbles as static particles…).
    I frankly feel that this project is in excellent hands. I tried the branch yesterday, but there ain’t much to do yet! Can’t wait to see progress!

    • This is great to hear, thank you. The goal is to merge things into master continuously, but meanwhile I will try to force the branch to rebuild every week.

  20. And… please, Blender Engine for UE4, Unity, Maya…

    • This is such a huge addition; it could put Blender on a whole new level and make it compete even more with other software. You guys are heroes!
      A particular interest in this kind of workflow is being able to integrate it into other software/engines, or to run it offline. Is this something you have considered yet?

  21. Pebble scattering. Another pie-in-the-sky feature that isn’t urgently needed. While basic functionality like groups that people have been asking for a decade continues to be rejected.

    • Hi, set dressing is the most required feature by the Blender Studio artists when they think about particles. It is also a good tangible initial target to tackle to help validate the system design before going too deep into dynamic particles. This is what was taken into consideration when we decided on this deliverable.

    • I assure you, particle nodes have been waited on just as long. It’s a missing feature that environment artists have wanted for years. I’m grateful it’s finally happening.

  22. Looking at the plan of attack, things seem a tad confusing. Is the plan to go back and forth between geometry nodes, dynamic particles, and hair?

    My issue is that, looking at the plan, it’s hard to get a clear picture of when a usable hair/particle system is going to be in place.

    Is this cut out mainly to suit the needs of the Blender Studio for their newest open movie project?

    • The first part of the backlog is in the order the team plans to tackle it. The second part may change once we get closer to it; it may even be tackled in a different way by different teams. It is too early to tell, as we are still experimenting with Scrum and the team, and we are still too far from that point to set the priority order in stone.

  23. What’s up with the title “Product Owner”, Dalai? It sounds very commercial.

  24. Yay, Blendini slowly becoming real

  25. This is exciting!
