First steps with Universal Scene Description

A shot of Spring, exported to USD via the very much work-in-progress USD support in Blender.

This article was updated on 2019-07-26.

Universal Scene Description (USD) is a file format created by Pixar. To create, manipulate, and read those files, they have also released their C++ library under the MIT Open Source license. Since the work on Toy Story started, Pixar have been trying to solve the challenge of having large teams of people working on a consistent set of digital files, with the aim of creating a feature film. This work culminated in USD, and with all the filmmaking experience of Pixar poured into an Open Source library it’s no wonder people want to see USD support in Blender as well.

A bit of Spring in USD.

In the past month I have been working on adding a prototype USD exporter to Blender. In this blog post I will explain what has happened so far and what steps are needed to move forward from here.

What happened so far

The USD library is rather pleasant to work with, so creating a simple exporter went pretty smoothly. The current work is loosely based on the Alembic exporter, with of course some improvements that’ll circle back to Alembic at some point. What we can do so far is:

  • Export a single frame or all frames in the scene,
  • write to a single USDA (ASCII as in the screenshot above) or USDC (a more efficient binary format) file,
  • export static and animated meshes with modifiers applied, including UV coordinates and edge creases,
  • write simple materials based on the viewport settings,
  • assign materials to subsets of the geometry (I think this works, but the Hydra viewport doesn’t support this yet, so I can’t visually verify),
  • export hair curves (just the parent strands, and not coloured yet),
  • export lamps (except spot lights, which USD doesn’t seem to support),
  • choose between Render or Viewport settings (for Simplify, modifier settings, etc.) to determine what ends up in the USD file.
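To give a feel for the output, here is a heavily truncated sketch of what a USDA file for a single exported mesh might look like (prim names and values are illustrative, and the arrays are elided):

```usda
#usda 1.0
(
    defaultPrim = "Cube"
    upAxis = "Z"
)

def Mesh "Cube"
{
    int[] faceVertexCounts = [4, 4, 4, 4, 4, 4]
    int[] faceVertexIndices = [0, 1, 3, 2, …]
    point3f[] points = [(-1, -1, -1), (-1, -1, 1), …]
    int[] creaseIndices = [0, 1]
    int[] creaseLengths = [2]
    float[] creaseSharpnesses = [10]
    texCoord2f[] primvars:st = [(0, 0), (1, 0), …] (
        interpolation = "faceVarying"
    )
}
```

The creaseIndices/creaseLengths/creaseSharpnesses triplet encodes one edge crease of sharpness 10 between vertices 0 and 1, and primvars:st holds the face-varying UV coordinates.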

What is not there yet

The way the exporter works now is rather limited, for various reasons. To start, writing an entire scene to a single USD file is not the way to go. This is evident when looking at the USD example files from Pixar, where every object is split up into three files and then referenced from the set definition file to form the stage on which characters can be animated. The Kitchen Set consists of no less than 230 files.

Kitchen Set example | Copyright © Disney/Pixar
Modeled by Christina Faraj based on concept art by Paul Felix for Lilo and Stitch

Splitting up the scene into smaller files makes loading, writing, and modifying a scene more efficient, and in general makes it possible to work together without stepping on each other’s toes. However, how exactly to break up a Blend file into small USD files, and subsequently how to tie those files back into a USD file that shows the same scene you had in Blender, is still an open question. This is likely to be studio-dependent as well, so it requires flexibility from Blender’s side.
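As a rough sketch of the idea (file and prim names hypothetical): the set file contains only placement and references, while the geometry lives in per-asset files:

```usda
#usda 1.0
(
    defaultPrim = "Set"
    upAxis = "Z"
)

def Xform "Set"
{
    def "TeaCup_1" (
        references = @./assets/TeaCup/TeaCup.usd@
    )
    {
        double3 xformOp:translate = (1.2, 0.4, 0.9)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

The referencing prim has no type of its own; it takes its definition from the default prim of the referenced file, and only overrides the transform.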

Then there is the concept of instancing. It is possible, both in Blender and USD, to have many copies of the same object without bogging down your computer too much. For Spring a shot with >10,000 individual objects is quite common; many of those are instances of the same pebble, rock, twig, or leaf and can be efficiently stored in memory. However, when writing those to USD with the current exporter in Blender, they are all written as individual objects. I have added some experimental code for creating references, but to do this efficiently the previous issue — what to store in which file — should probably be addressed first.
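In USD such instancing is typically expressed by marking the referencing prims as instanceable, so that all copies share one prototype; a hypothetical sketch:

```usda
def Xform "Pebbles"
{
    def "Pebble_001" (
        instanceable = true
        references = @./assets/Pebble.usd@
    )
    {
        double3 xformOp:translate = (0.1, 2.3, 0.0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def "Pebble_002" (
        instanceable = true
        references = @./assets/Pebble.usd@
    )
    {
        double3 xformOp:translate = (4.7, 1.1, 0.0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Both pebbles load the same prototype from Pebble.usd; only their transforms differ, so ten thousand of them cost little more than one.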

Finally there are all the features USD supports that we simply haven’t gotten around to building into Blender yet. To mention a few:

  • Richer material support: currently we only store diffuse colour, metalness, and roughness from the viewport settings of the material.
  • Authoring variants: the concept of variants, for example a flawless and a chipped version of the same tea cup, is deeply ingrained in USD, and “supporting USD” should take this into account.
  • Skeletal animation: currently we only write the evaluated, deformed mesh. This is for a good reason, as it ensures that the animation you see in Blender is what is actually written to USD. However, writing skeletal animation would open up interesting avenues of collaboration, for example for authoring game animations.
  • Light emission from mesh surfaces.
  • Ways to influence USD-specific properties; for example, lamps in USD have a colour temperature setting that Blender does not have.
  • Importing USD files, with everything that comes with it.
  • … and many more.
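To illustrate the variants item with the tea cup example: in USDA a variant set looks roughly like this (names and files hypothetical):

```usda
def Xform "TeaCup" (
    variants = {
        string modelVariant = "flawless"
    }
    prepend variantSets = "modelVariant"
)
{
    variantSet "modelVariant" = {
        "flawless" {
            def "Geom" (
                references = @./TeaCup_flawless.usd@
            )
            {
            }
        }
        "chipped" {
            def "Geom" (
                references = @./TeaCup_chipped.usd@
            )
            {
            }
        }
    }
}
```

Changing the modelVariant selection swaps the referenced geometry without touching anything else in the scene.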


USD support in Blender is still very much a work in progress. Proper support for a USD-based workflow takes a lot of UX, UI, and software design effort, and I’m guessing a few developers can easily enjoy themselves for a full year on this. If you’re interested in giving it a try, check out the sybren-usd branch (you’ll have to build Blender yourself though). If you have any knowledge or experience with USD, please contact me at [email protected] or the blender-coders chat channel, as I would love to pick your brain.

Update 2019-07-26

Fluid simulation run in Blender, exported to USD, and shown in the usdview application.

I’m blown away by all the positive response. Thanks everyone who sent me tips and suggestions for improvements. Since posting this article the following has been improved in my branch:

  • Added exporting of lights (the blog post was already updated for this, but I wanted to mention it explicitly).
  • Removed the “uv_” prefix from the names of the exported UV maps. This allows you to use special naming if you’re using other software that expects it (apparently Maya handles the name “st” in some special way).
  • Added exporting of mesh normals. This includes marking the exported mesh as polygonal mesh instead of Catmull-Clark subdivision surface, which is necessary to properly show things like custom normals.
  • Tested the merge of the MantaFlow branch and my USD branch, and it worked flawlessly.
  • Added exporting of mesh vertex velocities for fluid simulations. This is now only done for Blender’s current fluid simulation, and not for MantaFlow; I don’t want to have any tight coupling between the two branches.
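Put together, the normals and velocity changes make an exported fluid mesh look roughly like this in USDA (arrays elided, values illustrative):

```usda
def Mesh "Fluid"
{
    uniform token subdivisionScheme = "none"
    normal3f[] normals = [(0, 0, 1), …] (
        interpolation = "faceVarying"
    )
    point3f[] points.timeSamples = {
        1: [(-1, -1, 0), …],
        2: [(-1.02, -0.98, 0.01), …],
    }
    vector3f[] velocities.timeSamples = {
        1: [(0, 0, 0), …],
        2: [(-0.5, 0.5, 0.25), …],
    }
}
```

subdivisionScheme = "none" marks the mesh as polygonal, so the authored normals are used as-is; the per-vertex velocities let a renderer compute motion blur without having to look at neighbouring frames.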

You can try some of the USD files we created with Blender and see how they work for you. Over time I’ll upload some more files there. Please check the timestamp of the file against the changes in my sybren-usd branch; you could be looking at a USD file that has some quirks that have already been addressed in Blender.

  1. Hello,
    Since the USD library from Pixar recently gained Python 3 API support, I’m wondering if this API could be integrated into Blender’s Python. I understand that this USD exporter project is not implemented in Python, but if add-on developers had access to the Blender API on one hand, and to the Pixar USD API on the other, this would allow them to produce exporters fine-tuned to specific workflows.

    Of course, a general-purpose exporter like yours is still very useful for most users.
    But giving access to the Pixar API would allow everyone to export precisely what they need.

  2. Wow, this really got a lot of interest! I’m digging into Houdini at the moment, and especially the USD/Solaris components. I suspect that things would be much easier if we were to view USD in Blender as primarily subassemblies. I don’t (yet) see why we would have to change any of the underlying data model for Blender to match up with USD. The keys seem to be the way that we deal with the generating of paths and assets, and to a lesser extent the instancing. It also seems like you could call out to the USD tools to compact the usda files to usdc and take care of the instancing there. Either way, I hope this continues; I would love to see Blender interoperate more with other pipelines.

  3. Is there any update on this? Is there a timeline check sheet to follow?

  4. I want to ask, what about Hydra? It’s OpenGL-based and also integrated with USD. Could it be made into an engine like Workbench or Eevee? The code is there, so it would be a shame to throw it away.

  5. Exporting something to USD is a first step.

    Supporting all possibilities that USD allows would need a totally revamped software and data architecture.
    Houdini did so for Solaris, part of their upcoming Release 18.

    The misconception here is probably: USD is not only a file format, but a whole anim/geo data management concept as part of the USD C++ library that also has its own file format.

    Some of us would want to export to a USD-based studio pipeline.
    Others would want to export USDZ archives for use in the Apple Ecosystem with AR-Quicklook and RealityComposer.
    Both involve groups of USD files and textures but totally different ones.

  6. Any news on USD?
    Please give some help to Sybren to implement USD nicely and smartly; it’s clearly the standard that will take the lead (it’s already starting to).

    Generally speaking, I’m not sure that the dev team’s priorities are always on the right spot.
    I really think that Blender missed something by not integrating USD, OpenVDB, and OSL directly in 2.80.
    As Cobra says, if Blender wants to be in more production environments, one day a revamp of the software will be needed, and 2.80 was the right time. Anyway, it’s not too late, but the effort needs to be substantial.

  7. What’s the end goal here? Just import/export, or will we be able to have full USD support with editing, etc.? I know that might require more developers to work on it, and probably a big revamp of Blender and its renderers. AFAIK Autodesk is doing the same, not only for USD but for other open source projects like MaterialX, OpenVDB, and Alembic, and getting big support from Nvidia, Animal Logic, and other studios that are also pushing for the next decade or so. It would be a shame if the BF abandoned this project & other open source standards.

  8. Some software, like Houdini, is moving its core scene description to a USD base to ensure optimal performance, as well as a new Karma renderer that works on USD too, enabling real-time rendering without conversion.

    Is Blender moving this way?

  9. USD + Hydra + EEVEE/Cycles would be immense. We could render MASSIVE scenes. Similar to Clarisse and now Houdini.

  10. Are there any plans to take advantage of Hydra? Having Eevee and Cycles as Hydra delegates would be a boon to the OSS community, while also allowing Blender to work with any other renderers which have provided delegates.

    • There are no such plans at the moment. It would take a massive amount of work to change Blender’s internal data structures to match USD, which we would need to do in order to properly use Hydra. Without this, all data would have to be converted from Blender to USD, then passed to Hydra, then converted back to Blender-like structures for EEVEE or Cycles.

  11. Would be AWESOME if Blender could act as a modeling/layout/animation tool in concert with e.g. Houdini only using USD:

    For this Blender needs to be able to read a shot USD file and modify USD right inside the file in a similar manner as in the video above, then Blender could take over any task in the pipeline.

  12. Hi!
    USD export sounds amazing and by the list of what is there now it sounds like enough for me to try it :)
    I use Clarisse iFX and it has USD support, so it would be great to try Blender USD export with it!

    BUT building Blender is too difficult/complicated for me so I humbly ask if you could provide the compiled version for me/us?

    I fully understand USD support is still at its childhood but as I do all 3D stuff for fun, no business, it would be great to be able to try it and of course act as alpha or beta tester and give reports to you (I have done that for many software) :)

    I understand if you don’t want to do that as code in development is changing a lot and making some kind of daily build (even once) needs work from your side.
    It’s just that OBJ is slow and Alembic has no material assignment info so USD could be really great.

    I tested the sample Blender USDs and they work fine, except that the coordinate system is wrong, but that is of course just a setting in the exporter (need Y up). But other than that (even material assignments) those files seem to work, so to me it looks like your USD export is ready enough for me :)

    Thank you and keep up the great work!


    • Hey Antti, you can use the below link. This is a Blender build from GraphicAll for Sybren’s USD branch.
      So you can directly download and begin testing.

      Hope this helps!

      • Thank you VERY much! Just tried it and it seems to work with Clarisse; only the coordinate system is wrong, but as a workaround I can create a Locator (Empty), parent everything to it, and then rotate it Z -90 :)
        There seems to be a kind of ’empty’ for each object; I don’t know if that is normal for USD, but it is not a problem. Also, does USD support Collections as folders/contexts, or is everything always in one list?
        Material assignments come through, except when there is only one material per object; then it is always ‘default’ :)
        Anyway, already now this is a BIG help and it will be incredible when it develops further :)

        Thanks again!


    • > coordinate system is wrong but that is of course just setting in the exporter (need Y up)

      The coordinate system is just fine (Z=up is so much better than Y=up). Joking aside, USD supports both, and the USD files written by Blender actually declare that they use Z=up. The files are all valid USD. It’s up to the application importing the USD files to properly handle both cases if they want to declare proper USD support.
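      That declaration is a single entry in the layer metadata at the top of the file:

      ```usda
      #usda 1.0
      (
          upAxis = "Z"
      )
      ```

      An importing application can query this (UsdGeomGetStageUpAxis in the USD API) and rotate accordingly.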

      • Well, Blender is pretty much the only 3D software that uses Z-up (not the only one, but close), and all other exporters in Blender have axis settings (and/or they default to Y-up), so please add that to USD as well :)


  13. With USD support in the works, what about MaterialX? It looks like these two will be working hand in hand. Quickly reading through the MaterialX specification, it seems that it wouldn’t be that hard to add import/export support to Blender, as the existing Cycles/Eevee nodes are already a pretty close match to the MaterialX spec, imho.

  14. Hello. I’ve been observing the Blender Developer Blog for over a year now and I love the periodic updates. From one person to another, I wanted to ask how you learned to code. I want to, but I haven’t had much luck and I don’t have the money for any of CG Masters’ or CG Cookie’s tutorials. If you could give me any words of advice, I would appreciate it.


  15. Awesome work, Sybren!

    I have also been working on USD export in blender, with a goal of enabling one-click export to USDZ for use with iOS AR Preview and the new “Reality Composer” format in iOS 13 (I think it’s also supported on Android for AR). For this use-case you might want to export your whole “set”, or just a selected object, to a self-contained USDZ archive.

    A secondary goal is having an export format for blender that can preserve subdivision surface creases, as an efficient way of transferring mesh data for use with real-time GPU subdivision with OpenSubdiv on mobile devices.

    I think your approach of using the C++ code directly vs the Python bindings is better in the long term compared to my hack, where the exporter communicates via a JSON protocol between Blender’s Python and Python 2.7.

    I’m happy to donate my dev time to this effort, since this branch would be a lot smoother to land in official blender. I’ll follow up in the chat channel to see where I can help.

  16. Hello Sybren,

    extraordinary effort and result, bravo! Blender’s push for USD support is super important and I believe both file formats can coexist, they don’t have to replace each other.

    To get a better idea of how it’s used by studios in production, you might want to look at Pixar (USD in Presto), Dreamworks Animation (USD with Alembic), Animal Logic (USD with Maya etc.) all grounding their pipelines on it.

    With USD Blender users could collaborate on assets and shots in really fast and efficient ways building really complex scenes without the burden of current pipeline miscommunication.

    So, congrats Sybren! Is there a way to follow your progress with USD?

  17. We’re integrating USD into our pipeline here at Fox VFX Lab. It’s got so much potential to provide a solid approach to get around the limitations of FBX. It’s great to see that Blender development for this is in the works!

    • I see from your website that you do most of your professional work in Maya. What kind of stuff do you use Blender for? What do you think about it?

      Anyways, that’s an impressive portfolio.


      • I currently work as the Lead Technical Animator/Supervising Animator for Fox VFX Labs (now part of the Disney family). Our pipeline is very movie driven, so we’ve been a Maya studio so far. Yet with the latest underwhelming releases, 2018 and 2019, I gave Blender 2.8 a look and was floored by how much progress had been made towards it being a production-ready content creation application.

        I’m in the beginning stages of talking with my engineering team about having it in our pipeline as an optional rigging and animation tool with the eventual goal of dropping Maya. Epic and Ubisoft’s funding is paving the way to getting blender the professional attention it deserves IMO.

        • Thanks for the reply! This is very cool. I think you’ll be pleasantly surprised by the animation experience in Blender. Simply having a 3D cursor is such a huge advantage. The tool you show on your website that makes new pivots in Maya is awesome, very impressive. But I don’t think it would be necessary if Maya had a 3D cursor (well, it handles rotation and scale in a really different way because of its pivot points). Especially with the example you show of manipulating an IK foot- I use the 3D cursor for that task and it’s a lot faster. And of course, there’s a lot more, besides!

          Recently I’ve been experimenting with Blender project version control using Git. I’d be happy to share some of my experience if you email me at [email protected]. For what it’s worth. I’m not nearly as experienced as you are, but I’ve been using Blender for a while and I think I’ve figured out a good way to keep a project clean.

          • Oh I love blender! I’ve been using it for freelance projects for about 8 months now and have found the animation system is FAR more comprehensive than Maya. With Maya you need tons of scripts and dev time to get to a base level where Blender has been for years. I love the ability to pick controllers and objects that overlap each other and the manipulator free rotation and move modes. There are some quirks and things that have to be thought of in a different way (namely getting rotation copied across pose bones and how constraints spaces work), but I plan on diving into the python API and hopefully porting some of the tools I have written over my career to close the gap. Moreover, I’m working to get our animation team using it in some capacity at work ;)

          • Very cool! I’ve only used Maya a little bit and I was worried my tepid feeling toward it was just because I wasn’t getting it.

            I’ll link you to an article I wrote on my website about using Python in rigging: — might help some. I plan on writing quite a few more of these once I find the time, but it’s been about three months since I’ve updated my website.

            Blender’s Python API is flexible enough that it’s easy to use it wrong if you don’t know what you’re doing (like using bpy.ops instead of, or relying on context to supply information when you can get it directly), and a lot of useful stuff isn’t immediately obvious unless you know where to look- like using the console for debugging cyclic dependencies. Some things will give you weird bugs if you use them wrong, like trying to access edit mode data in pose mode. Anyhow, I gave you my email if you want to reach out to me about any of this.

  18. Looks like you know where you’re going! I’m happy to see such an understanding of “industrial” workflows. Keep up the good work, Sybren!

    Alembic and USD share some concepts. As you state, if the generic exporter can be improved and Alembic supports those features, that would rock!

  19. Has anyone thought about turning Cycles into a Hydra delegate? :)

    • Using Hydra requires keeping the USD stage loaded in memory, and if you import a part of your scene as USD but the rest comes from another package using Alembic (or whatever other format), then you won’t have that in your renderer.

      Hydra really only makes sense when you’re using USD as your main internal data structure. This would be interesting if Blender were dropping its own internal data layout (DNA/RNA) in favour of USD, but I don’t see that happening soon :)

      • Especially after spending years overhauling the dependency graph for 2.8, I don’t think now’s a good time to make a big switch!

        What I’d like to see is an EEVEE that’s as fast as Hydra.

      • That would be the best decision ever though.

        USD-based Blender + USD-compliant Cycles/Eevee would mean killer real-time performance and future-proofness.

      • The main benefit of a Cycles Hydra delegate, imo, is to allow Cycles to be used outside of Blender, on standalone USD files, without having to import into Blender. These USD files could have massive scenes and timelines, and rendering directly from them would be more optimized. It would also allow Cycles to be used by other DCCs switching to USD, such as Houdini.

        It also enables an Animal Logic USD Maya-like workflow. They stream pieces of a USD stage into and out of Maya, allowing them to quickly edit USD files directly. This lets multiple DCCs collaborate, such as Maya and Houdini, which specialize in different things. Then they don’t render through Maya; they render the USD directly via Hydra.

  20. Hi, what about the glTF exporter/importer? Could it be related to this, as it’s got some similarities?

    • I’m not familiar with glTF. That is, I know roughly what it is of course, and broadly what it’s used for, but I’ve never read the specs or dived deeper into the technical details.

      I’m working on an abstraction layer for exporters that need to (be able to) work on an entire scene, keeping in mind that the Alembic exporter could at some point be rewritten to use that layer too. Maybe a future glTF exporter could too.

    • glTF is a transfer format for OpenGL-like content (hence the name), i.e. game models for real-time rendering, in a straightforward and compressed format suitable for the web.

      USD seems to be more like Alembic for exporting movie scene geometry for use / rendering elsewhere.

      So they are maybe quite different (glTF materials are GLSL etc., usually PBR now), though indeed there are similarities, as both are scene formats which are great to have support for in Blender.

  21. It is a bit strange that mesh lights are supported and spot lights are not.
    Temperature for lights can be assigned with the «Blackbody» node in Cycles, so this temperature could be taken for USD.

    Since USD is an open format, I hope it is open for suggestions and improvements.

    Switching variants via animation is sadly still not implemented in Blender. It is a useful feature for making LODs or imitating stop-motion clay animation.

  22. Super exciting, Sybren! We’d welcome any questions you have on the usd-interest forum.

  23. I think you shouldn’t try to convert a full existing set to USD; that’s not the purpose of it.
    Instead, we should be able to export an independent asset as a unique USD file (I think this part already works), and then add tools to Blender to link these files into other scenes (like a linked group). The tools should allow overrides on a USD asset, and the final shot export should only point to the USD assets and keep these overrides. If some asset is not a USD-linked file (like a rigged character), then you can export it as a separate USD file and link it to the shot USD.

    • Well yes, but that implies a fully-fledged workflow of having USD and non-USD data flowing into Blender at the same time and having excellent control over what to export where. Not something we’ll have any time soon.

      • I agree it’s a big job, but I think you have already made a big step!
        What I mean is that the next one could be linking USD files into Blender (only supporting meshes, transformations, and hierarchy). With this we could already use it in production.

  24. Amazing.

  25. “(except spot lights, which USD doesn’t seem to support)”
    Using UsdLuxShapingAPI allows you to implement spot light features:
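    For example, a cone restriction on a sphere light can be authored roughly like this (attribute names per the UsdLux ShapingAPI schema; values illustrative):

    ```usda
    def SphereLight "Spot" (
        prepend apiSchemas = ["ShapingAPI"]
    )
    {
        float intensity = 800
        float shaping:cone:angle = 25
        float shaping:cone:softness = 0.2
    }
    ```

    shaping:cone:angle limits emission to a cone around the light’s axis, which is essentially what a spot light does.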
