SIGGRAPH 2018: A developer’s viewpoint

This year, Monique Dewanchand and I were able to attend SIGGRAPH in Vancouver. Together we run a small company called At Mind, doing lots of Blender development and projects. Below is my report with observations on the state of open source at SIGGRAPH.

In the past years, several VFX companies have opened up parts of their pipelines. This has resulted in open source projects where they address their common challenges.

We went to several Open Source BoF meetings and met the people and studios behind the projects.

USD / Hydra

When I first heard of Universal Scene Description I thought that it was just a file format for data exchange that was capable of loading your scene very fast. At SIGGRAPH I saw the power of USD and the problems it aims to solve.

USD is positioned as the data backbone for studios. Studios can use it from layout to final rendering; it is optimized for artistic iteration and even makes it possible for multiple artists to work on the same asset at the same time. USD can be used to transfer data between departments and to easily integrate the resulting work back.
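The multiple-artists-per-asset workflow rests on USD's layering: each department owns a layer and stores only sparse overrides on top of the layers below it. A minimal sketch in USD's text format (a hypothetical example; file names, prim paths and values are all invented):

```usda
#usda 1.0
(
    """shot.usda: a shot layer owned by one department. It composes the
    modeling department's asset file and stores only sparse overrides,
    so both files can be edited at the same time."""
    subLayers = [
        @robot_model.usda@
    ]
)

over "Robot"
{
    # Only the overridden attribute lives in this layer; everything
    # else still comes from robot_model.usda.
    double3 xformOp:translate = (0, 0, 10)
}
```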

When working with large scenes in Blender, you need to load all the data. Blender controls all the assets that are loaded, which makes working with huge scenes slower than you would like.

Hydra is a viewport renderer that can be integrated with tools to display USD. You are able to select a renderer to render the scene to your display. It has an OpenGL back-end, but back-ends for other interactive renderers can be developed as well. I would like to see Cycles hooked up to Hydra.

USD/Hydra has its own dependency graph that is optimized for large scenes. I saw a demonstration where the Hydra viewport was integrated in Maya, displaying a very complex VFX scene. Only the part the artist was working on (an armature) was loaded in Maya, so it interacted really quickly.

In theory Hydra fits Blender’s 2.8 viewport concept. There are technical challenges, such as Hydra requiring OpenGL 4.0 to run. Further research on this subject will probably find a solution.

MaterialX / ShaderX / OSL / Gaffer

When using multiple applications, materials can be rendered differently. MaterialX (http://www.materialx.org/) addresses the lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a model from one application or rendering platform to another. This includes shading setup, patterns and texturing, nested materials and geometric assignments.

MaterialX describes materials. A material can also have multiple looks, for example one for a dry condition and one for a wet condition. MaterialX is integrated in many tools nowadays, which gives a similar representation of materials across these tools.
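A material with two looks could be sketched in MaterialX’s XML syntax roughly like this (a hypothetical fragment loosely following the 1.36-era element names; the material, shader and geometry names are invented):

```xml
<?xml version="1.0"?>
<materialx version="1.36">
  <!-- A material wrapping a standard surface shader. -->
  <material name="M_car_paint">
    <shaderref name="sr_paint" node="standard_surface">
      <bindinput name="base_color" type="color3" value="0.8, 0.1, 0.1"/>
      <bindinput name="specular_roughness" type="float" value="0.5"/>
    </shaderref>
  </material>

  <!-- Two looks: the same geometry rendered under different conditions. -->
  <look name="dry">
    <materialassign name="ma_dry" material="M_car_paint" geom="/car/body"/>
  </look>
  <look name="wet">
    <!-- A wet look would typically assign a variant of the material,
         e.g. with lower roughness. -->
    <materialassign name="ma_wet" material="M_car_paint" geom="/car/body"/>
  </look>
</materialx>
```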

MaterialX has some prerequisites before it can be used: you need to support a common set of BxDF shaders across the tools you use, otherwise there might be visual differences. The MaterialX library currently provides OSL definitions for some basic nodes to make them straightforward to implement. In the future they hope to make GLSL definitions and other common shading languages available as well.

MaterialX has a well-defined set of standard nodes for texturing, including a reference implementation. For shading, however, there is no description or standard library: surface, light and volume shaders are treated in MaterialX as black boxes. ShaderX is an extension of MaterialX that defines building blocks for describing shaders, as well as functionality to convert these shaders to OSL/GLSL source code. This code can then be compiled and used by applications and renderers.
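For reference, the kind of black-box surface shader that ends up generated or referenced here looks like this in OSL (an illustrative minimal shader, not part of either library):

```osl
// A minimal diffuse surface shader. The renderer supplies the
// global normal N; diffuse() is a built-in OSL closure.
shader simple_diffuse(
    color base_color = color(0.8, 0.8, 0.8),
    output closure color result = 0)
{
    result = base_color * diffuse(N);
}
```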

A number of studios nowadays use Gaffer (http://www.gafferhq.org/). I had never heard of Gaffer, but I met one of its developers at Beers of a Feather. He described Gaffer as an open source project for look development. I was interested, as I worked on look development in Blender 2.8.

Gaffer provides a node-based configuration of your scene, render layers, material assignment, and the triggering of your rendering or compositing tasks. The configuration can be shared between scenes; for example, a character will always be rendered in the character render layer, which saves a lot of time setting up your renders. Blender also has render layers, but they need to be set up per scene. Having a node-based system to configure the render layers and when the renderer/compositor/sequencer is called would give Blender a lot of extra flexibility.

OpenColorIO

In the past years not much has happened with OCIO, but nowadays even Autodesk is replacing its internal color management with OCIO. The changes they made will be released with the v2.0 release.

In previous versions there is a difference between the GPU and CPU implementations of OCIO: the GPU uses a baked LUT, while the CPU interprets the color configuration directly. V2.0 gives you the choice of using the LUT or using the color configuration on the GPU as well.

Studios need to create their own OCIO configuration. Most parts of these configurations are exactly the same between studios, for example the OCIO information for a camera or the standard color spaces. The studios want to address this duplication, though perhaps not as part of OCIO itself, as it does not fit OCIO’s scope.
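The duplication is easy to see in a minimal OCIO configuration (a hypothetical sketch; the color space names and LUT file are invented): blocks like these appear in almost every studio config.

```yaml
ocio_profile_version: 1

roles:
  scene_linear: linear
  color_picking: sRGB

colorspaces:
  - !<ColorSpace>
    name: linear
    description: Scene-linear reference space.
    bitdepth: 32f
    isdata: false

  - !<ColorSpace>
    name: sRGB
    description: Display space for sRGB monitors.
    bitdepth: 8ui
    from_reference: !<FileTransform> {src: linear_to_srgb.spi1d}
```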

VFX Reference Platform

The VFX Reference Platform (https://www.vfxplatform.com/) is something completely different. It is not an open source project, but an attempt by the industry to standardize on software versions to ensure better binary compatibility. Ideally studios can develop add-ons that run in multiple applications from a single code base or binary.

The power of this kind of platform is that the industry comes together and talks about very difficult topics, like the migration from Python 2 to Python 3. Blender did it years ago (Blender 2.5, 2009), but in my experience this is a very difficult task. I was impressed by how it was discussed between studios and software vendors. The result: in 2019 the major software packages will support both Python 2 and 3, so the studios can start migrating their pipelines. The aim is that Python 3 will be the standard in 2020.
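Much of the migration pain is code that has to run under both interpreters during the transition. A tiny generic sketch (not from any actual pipeline) of the dual-compatible style this forces:

```python
from __future__ import division, print_function

# Integer division changed between Python 2 ("/" truncates for ints)
# and Python 3 ("/" is true division); the __future__ import makes
# both interpreters behave like Python 3.
def frame_to_seconds(frame, fps=24):
    return frame / fps

# print is a function in Python 3; the __future__ import makes the
# same call syntax valid in Python 2 as well.
print(frame_to_seconds(48))
```

Tools such as six and python-future build on exactly these mechanisms to let a single code base span both versions.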

For Blender this reference platform is something we could look into so we can support studios better.


13 comments
  1. When will Blender work with Skyrim? For me Blender does not work until it supports my game of choice, Skyrim.

  2. I would like to see a VST work inside Blender,
    or create 3D graphics by playing with audio waves.

  3. We need to type Arabic text, but in Blender 2.8 we still can’t do it, because Arabic text uses right-to-left paragraphs. Can this be improved?

  4. Trying to match technology with other software is important (VFX Reference Platform).
    We didn’t integrate Blender into our pipeline because of the Python version mismatch. The more compatible you are, the more usage you will see.

  5. It’s super cool that so many are becoming open source and it’s also cool that the industry is moving in the direction of program independence.
    When Autodesk axed Softimage, it seems that a lot of animators and animation studios were caught off-guard by how quickly it happened in the end and how bad the whole thing is for an established pipeline. If we could at some point start archiving all our work in a universal format that can be interpreted across many platforms so that there is free choice – that would be a fairytale dream for digital artists.

    finally no more “Make sure to use this program to open our shaders” or “please make sure to get our look descriptions right” or “I hope the program will be supported in a few years when our library has reached over 1 TB of data” ….
    This can’t come fast enough for me! Especially if it’s designed to work across games and offline rendering!

  6. Hi,

    One of my back-burner projects is to eventually build Gaffer against Blender 2.80. I’ve got a very old PR that someone else made as a backup, which I’ve cleaned up a bit (Cortex has code like USD/Alembic in that it converts to/from DCC app formats):
    https://github.com/boberfly/cortex/tree/contrib/blender

    And eventually will have Gaffer manage Cycles:
    https://github.com/boberfly/gaffer/tree/GafferCycles

    I’ve also been porting Gaffer to Windows (it doesn’t contain the above code):
    https://github.com/boberfly/gaffer/releases/download/0.46.0.0/gaffer-0.46.0.0-windows.zip

    Cheers….

  7. I really like seeing these open-source advancements. I would love to see a replacement for FBX – something NOT controlled by Autodesk (glTF, maybe?)

  8. Has long as I am concerned, OCIO has very much delivered. The future of it looks like only polishing and stabilization, but the initial goal was reached. Its integration into Blender could still be much better though.

    • as long as …
      sorry

      • “as far as” sorry…but not really….we knew what he meant initially.

        I’m a little upset BF didn’t have even a tiny cardboard box this year….maybe next year someone will help sponsor..

        • ignore my comment…I thought someone had corrected you..and I have low blood sugar(it contributes to my irrational behavior)

  9. It should be noted that USD/Hydra does not necessarily require OpenGL 4.0. Hydra is kind of a “renderer middle layer” meant to go between the scene delegate (Blender) and the render delegate (renderer). The point of that middle layer is that the renderer no longer has to pull directly from the app, like Blender.
    The demo at SIGGRAPH by Luma Pictures was pulling in data from a Maya “scene delegate” and rendering through the Hydra viewport.

    So the default viewport render engine Hydra comes with is based on OpenGL, but there is also a Metal-based render delegate built by Apple, as well as render delegates being worked on for many major renderers: Arnold, RenderMan, and we demoed one for ProRender. But the OpenGL 4.0 requirement only applies if you use the default GL renderer it comes with.

    (Also, I should note I’ve started work on adapting USD to Blender as an add-on; the harder part is that the Python bindings of USD are all based around Python 2 Boost. If anyone is interested in helping out, let me know.)

    • Thanks for the insights! Nice to see that there is already some work started on USD and Blender.
      I haven’t tried it, but boost.python has a switch that can be configured in the build system to generate Python 3 bindings.
