Blender Dependency Graph Branch for users

Hello! I'm visiting here to talk about work being done by Sergey, Joshua, Lukas and others updating Blender's dependency graph. Anyone can test it by building the depsgraph_refactor branch from git.

How?

To make things interesting I'm testing on Elephants Dream files. To do this, I also have to update the project to work in post-2.5 Blender! This exposes bugs and todos in the branch by running it against a large set of working files that have to match their previously known behavior. As a side effect, Blender Cloud subscribers and others should gain access to an updated Elephants Dream, and we'll have a couple of new addons: one to update old files, and one to create walk cycles on paths. Not to be stuck on old things, I'm also creating some useful rigs that are impossible without the refactor.

But, what is it?

Well, what is this “depsgraph” anyway, and why does it need updating? Simply put, without a depsgraph, you would not be able to have constraints, drivers, modifiers or even simple object parenting working in a reliable way. As we make our complicated networks of relationships, Blender internally builds an “A depends on B depends on C” type of network that looks very much like a compositing node network. With this network, for each frame, Blender knows to update A before it updates B before it updates C. This is how, for instance, child objects can inherit their parents' transforms before updating themselves.
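For the code-curious, here is a minimal sketch of the kind of chain the depsgraph has to sort out, assuming three hypothetical objects named "A", "B" and "C" in the current scene:

    import bpy

    a = bpy.data.objects["A"]
    b = bpy.data.objects["B"]
    c = bpy.data.objects["C"]

    # C is parented to B ...
    c.parent = b

    # ... and B copies A's location through a constraint.
    con = b.constraints.new('COPY_LOCATION')
    con.target = a

    # To draw any frame correctly, Blender has to evaluate A, then B, then C,
    # in that order; working out that ordering is the depsgraph's job.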

Why is it being updated?

The current dependency graph was written during Elephants Dream (haha! the circle is complete). That was way before the modern “everything can be animated” animation system we have now. The design really worked for the rigid old system, in which only specific properties could be animated. From 2.5 until now, only dependencies that worked in 2.4x could reliably be expected to work, even though the interface allows you to create others. Think of driving a bone's transform with another bone's transform in the same rig, or parenting an empty to the body of a character and then IK'ing the arm to that empty, or trying to get a flower to open up based on the brightness of the sun lamp… Even worse, the interface fully allows you to set up these drivers, but after you do, you get strange lags and stutters, with very limited feedback as to why. Previous patches enabled some very specific new setups without really changing the system under the hood.

With the update, we can expect these setups and more to work in a predictable and speedy way. It also lays the groundwork for future changes in Blender, such as a new node system for modifiers, constraints, transforms and particles, basically enabling more proceduralism and flexible rigging. For now, in addition to “Animate all the things” we will be able to “Drive all the things” – very cool.
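To make the first of those examples concrete, here is a rough sketch (mine, not code from the branch) of driving one bone's rotation with another bone's rotation inside the same armature object, using the regular Python driver API. The names "Rig", "upper" and "target" are made up for the example, and the driven bone is assumed to use Euler rotation:

    import bpy

    rig = bpy.data.objects["Rig"]
    pbone = rig.pose.bones["upper"]

    # Add a driver on the bone's X rotation (index 0 of rotation_euler).
    fcurve = pbone.driver_add("rotation_euler", 0)
    driver = fcurve.driver
    driver.type = 'SCRIPTED'

    # The driver reads another bone's local X rotation from the *same* rig:
    # exactly the in-armature dependency the old depsgraph could not resolve.
    var = driver.variables.new()
    var.name = "other_rot"
    var.type = 'TRANSFORMS'
    target = var.targets[0]
    target.id = rig
    target.bone_target = "target"
    target.transform_type = 'ROT_X'
    target.transform_space = 'LOCAL_SPACE'

    driver.expression = "other_rot * 0.5"

Setting this up has always been possible from the UI or from Python; the difference is that with the refactored depsgraph the driver actually evaluates reliably instead of lagging and stuttering.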

Introducing Dr. Dream


It turns out old Elephants Dream files *almost* work in 2.5 – 2.7, with the following exceptions:

  1. Action Constraints in Proog and Emo had “wrong angles” due to a bug in the old constraint. Since it got fixed, these numbers have to be updated.
  2. Shapekey drivers have different data-paths and reference shapekeys by number instead of by name, which breaks the driven shapes.
  3. We used an old NLA feature that allowed putting groups in the NLA and having strips refer to the rig inside the groups. This feature was removed during the animation system recode, and all that animation just stopped working – this mainly affects the robotic ducks in the background of shots.
  4. Another (terrible!) feature was the whole stride bone offsetting for walkcycles, which allowed characters to walk on paths. It was cumbersome to set up and resulted in much sliding of feet, and thus was never recoded in the new animation system, which means all our walking-on-paths characters don't walk anymore.
  5. Some cyclical dependencies (Empty -> Armature -> Empty again) cause bad, laggy evaluation. We simply got away with this in the few shots where it happens, but it is not guaranteed to ever render correctly again (even on 2.4!!!)
  6. Proog, Emo and animated characters are local in each shot, meaning fixes have to happen in every file.

To solve problems 1–3 I wrote an addon called Dr. Dream – an inside joke, since we used to call many Elephants Dream scripts “Dr.” something, and because this Dr. is actually helping the patient work in new Blenders. Dr. Dream also handles problem number 6: being a script, it can be run in every file, fixing the local characters.
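As an illustration of the kind of repair Dr. Dream does for problem 2, here is a hypothetical sketch (not the actual addon code) that rewrites index-based shape key driver paths into name-based ones; it assumes the broken drivers ended up with data paths like key_blocks[3].value:

    import bpy
    import re

    # Matches an index-based reference such as key_blocks[3]
    index_ref = re.compile(r'key_blocks\[(\d+)\]')

    for key in bpy.data.shape_keys:
        if not key.animation_data:
            continue
        for fcu in key.animation_data.drivers:
            match = index_ref.search(fcu.data_path)
            if not match:
                continue
            # Look up the shape key's name and rewrite the path to use it.
            name = key.key_blocks[int(match.group(1))].name
            fcu.data_path = index_ref.sub('key_blocks["%s"]' % name, fcu.data_path)
            print("fixed driver path:", fcu.data_path)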

To solve problem 5 I will do the following: Nothing. The depsgraph refactor will take care of this for me!!!!

Problem 4 requires coding a Python solution; this is a big project and will be the subject of a future post.

New Setup: soft IK


I'll do a series of posts on useful rigging tricks possible in depsgraph_refactor. This one can be added to existing, animated rigs – even Elephants Dream ones – and was not possible before the refactor, because it relies on driving the transformation of one bone by another in the same armature object. Some of the animators among you may have noticed a problem when animating IK legs: as the legs go from bent to straight (and sometimes bent again, as during a walk), the knees appear to “pop” in a distracting way. The reason turns out to be simple math: as the chain straightens, the velocity of the knee increases (in theory to infinity), causing the knee to pop at those frames. There are a couple of excellent blog posts about the math and theory behind this here and here, and an old blog post about it in Blender here.
If you want to check out the blend file in that video, you can download it here. Note that I've exaggerated the soft distance; it really works fine at 0.01 or less. You can edit the number in line 6 of lengthgetter.py and then just rerun the script to see the effect. Too high a value (what I have) can make the character seem very bent-legged.
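For readers who cannot grab the blend, here is a hypothetical sketch in the spirit of lengthgetter.py (my own names and falloff formula, not the actual file): a function that measures the root-to-target distance of an IK chain, softens it as it approaches full extension, and is registered in bpy.app.driver_namespace so a scripted driver can call it:

    import bpy
    from math import exp

    SOFT = 0.01  # softening distance; the post suggests 0.01 or less

    def soft_ik_distance(rig_name, root_name, target_name, chain_length):
        """Distance from the chain root to the IK target, remapped so it
        approaches chain_length asymptotically instead of slamming into it."""
        rig = bpy.data.objects[rig_name]
        root = rig.pose.bones[root_name].head
        target = rig.pose.bones[target_name].head
        dist = (target - root).length

        hard_limit = chain_length - SOFT
        if dist <= hard_limit:
            return dist
        # Soft-IK falloff: the result never quite reaches chain_length,
        # so the knee's velocity stays finite and the pop disappears.
        return chain_length - SOFT * exp(-(dist - hard_limit) / SOFT)

    # Make the function available to scripted driver expressions, e.g.
    #   soft_ik_distance("Rig", "thigh", "foot_target", 2.0)
    bpy.app.driver_namespace["soft_ik_distance"] = soft_ik_distance

In a real rig, a driver on the bone that holds the actual IK target would use this value to pull the target in before the chain locks out; the exact setup in the posted blend may differ.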

26 comments
  1. Thank you Bassam for sharing this information.

    Is there another link for your example blend file? I am unable to download it since the link in this post is no longer valid.

    Thank you.

  2. Hi Bassam,

    sorry for the late question, but I was wondering: will the new dependency graph tackle or solve any of the issues regarding Spline IK in Blender?

  3. Hi Bassam
    I’ve been working on implementing a soft IK setup on an existing rig generated from a MakeHuman character. The setup is directly inspired by this article. The system seems very simple when you explain it, but I have been tearing my hair out trying to make it work on my model. I must confess that rigging remains somewhat obscure to me when it gets too complex.
    The general setup seems OK but I can’t make the leg stand in a straight position and it remains bent no matter what. Here’s the link to my blend file. I’ve only made the setup on the left leg:
    http://www.pasteall.org/blend/37042
    Has anyone an idea on how to make it work properly?

    By the way, if there’s a coder in the place who wants to make it an addon… be blessed forever.

    Cheers
    Fred

  4. Does anyone know how to enable the new dependency graph? Where do I put --enable-new-depsgraph?

    • When you run Blender. Instead of running it by double-clicking the file, you have to run it from a console. On Windows, that is “cmd”; on Linux, use the terminal. Go to Blender’s directory and type “blender --enable-new-depsgraph”, without quotes.

  5. Does it work in BlenderPlayer, too?

    • I mean, will we be able to use something like “blenderplayer --enable-new-depsgraph file.blend”?

  6. Hi, does the new dependency graph have a solution for the Spline IK system? It is important to be able to use Spline IK within only one armature. Please fix this.

  7. I have run into an issue with constraint evaluation. You can see the details here:
    http://community.cgcookie.com/t/is-this-a-bug-with-constraints-and-armatures/780

  8. Since this seems to be a general hurdle for all IK chains in animation, are there any reasons not to integrate this function into the IK constraint itself as a checkbox?
    Using the same IK target and just referring to the original length from Bone1-Head to Bone?-Tail should keep the settings manageable.

    I think many users would be thankful.

  9. Does the depsgraph have any influence on the performance of the BGE in terms of FPS?

  10. Hey Bassam,
    Always enjoy helping with test builds so the devs can get feedback on their progress :)

    I uploaded a new build to graphicall @ http://graphicall.org/1145
    new build = Depsgraph_Refactor_Win7.x64-b4deb49

    Since I can’t edit my post here, I can cut down on noise for new builds :p

    just let me know of any links you guys want added to it :)
    Cheers,
    ~Tung

  11. awesome, thanks TUNG!

  12. As in…
    https://developer.blender.org/diffusion/B/browse/depsgraph_refactor/
    if you build master now, just use a git bash terminal and enter…
    git checkout depsgraph_refactor
    then you’re in that branch

  13. Paul, I guess that’s not known yet, but I’ve found the branch to be quite stable – I’ll continue to test and give feedback to the coders working on it, and I hope it can be merged soon.

    This is really cool. I always wondered why the IK stays behind like that sometimes in some rigs but not others.
    Any ideas when this will make it to master?

  15. Hey Stephen, to be fair, experienced animators can usually handle the pop by finessing the speed by hand – but this allows faster work, especially if you’re doing a ton of cycles for bg characters. I’ve been thinking I could try making a simple addon to ‘soften’ IK on an existing rig (with some input from users, namely selecting the IK target and root of the chain) – would that be helpful?

    Tom, I wasn’t aware of self in PyDrivers – I guess it was already gone when I used them. Do you know why it was removed? If it was for dependency reasons, maybe it could come back with the refactor. Instead of self, you could just write an external script to add your own functions into the driver (as I did here), then register them and use them in your driver. That way you wouldn’t have to mess with frame change handlers and you’d know exactly where in the stack your driver is executing – basically just write your function with full access to the bpy module, and register it with bpy.app.driver_namespace – you can see that in lengthgetter.py in the blend I posted.

    • BASSAM said: …just write your function with full access to bpy module, and register it with bpy.app.driver_namespace…

      Thank you; but I don’t know how to do that,
      given the fact that I don’t know the names of the objects affected:
      they will be objects duplicated via duplifaces (but then I don’t get the “choose one from a group” option),
      or via particles (but then I don’t get the scaling option of duplifaces),
      or via a Python script.

      The .blend you posted? Where?

  16. Because there is no more “.self” in PyDriver expressions [could someone PLEASE reinstate that?],
    I’m writing a lot of Python to set ShapeKey values (of wooly fur) in response to the angle
    that a rigged and wooly character’s skin facets’ normals make with global Z, and, of course,
    I want such a script to execute at every frame, with bpy.app.handlers.frame_change_pre, yes?

    Question:
    Will this first calculate the ShapeKey values, then move the character?
    Is there, or will there be, a way in which the user can specify what to do first?

  17. I didn’t know I needed this and now I can’t live without it! Thanks to you all for this excellent work. I thought knee popping was all my fault and just something that would always need tweaking to get right :)
