EPISODE 1637 [INTRODUCTION] [00:00:00] ANNOUNCER: Blender is a free and open-source 3D graphics tool that was initially released in 1994 and just hit version 4.0. It's one of the triumphs of open-source software development and is used for creating animated films, art, 3D games and more.  Sybren Stüvel is a senior software developer at Blender. He joins the show today to talk about the history of Blender, its path-tracing renderer, managing large-scale render farms, the Blender data structure, Python scripting in Blender and much more.  Joe Nash is a developer, educator and award-winning community builder who has worked at companies including GitHub, Twilio, Unity and PayPal. Joe got his start in software development by creating mods and running servers for Garry's Mod. And game development remains his favorite way to experience and explore new technologies and concepts.  [INTERVIEW] [00:01:01] JN: Welcome to Software Engineering Daily. I'm your host for today's episode, Joe Nash. And today I'm joined by Dr. Sybren Stüvel. Sybren is a developer at the Blender Institute where, amongst other things, he leads animation and rigging. Welcome, Sybren.  [00:01:14] SS: Thank you for having me, Joe. Yeah, it's a pleasure.  [00:01:16] JN: Well, before we get into - I want to get into your background and your journey into Blender. But for folks who are listening to this who are unfamiliar with Blender, I want to start with a really brief intro. Could you tell us a little bit about what Blender is?  [00:01:28] SS: Well, Blender is basically a big 3D workhorse. You can do modeling but you can also do sculpting. Well, it's kind of a way of modeling. But you can do sculpting. You can do texturing. But, also, animation, rigging, of course. And it basically is a full pipeline. You can also do video editing in there. You can render out - it has different render engines that are aimed at different kinds of looks and different techniques, which lead to different rendering times.
And so, it really allows you to start from scratch from an empty 3D world and end up with your own film.  [00:02:04] JN: Perfect. Awesome. Yeah. We'll come back to a lot of those topics in due course. But first, I believe you've been working with the Blender Institute for a really long time. Or you've been surrounding Blender even longer than that. Can you tell us, how did you first get involved with Blender?  [00:02:19] SS: Well, before that even, I started using Blender myself. Before that, I was dabbling a bit with LightWave, which I really enjoyed. And back then, Babylon 5 was on TV and they used LightWave actually for all the space shots. The ability to recreate those spaceships on your own computer and make your own little videos with it, that was fantastic, especially because they didn't do that much in terms of texturing. They just used the built-in Voronoi textures and that kind of thing. You really could make it look like the TV show. At least at the skill level that I had back then, for me it looked exactly the same, which was fantastic.  LightWave was - it was kind of special because it had different modes. I think it was tab, or enter, or something. You press that and then you get - instead of arranging your scene, you were starting another app that allowed you to edit your model. That I found really useful because then you're free of all the clutter of that entire scene and you could just polish that model. And then with the same key, go back to your entire scene again.  And so, after that, I tried 3D Studio Max, which drove me absolutely bonkers because it doesn't have that. And so, you're working on your model and all of a sudden you've selected some objects in the far background and you're now working on that.  And then I heard news about this Blender thing that was struggling to become open-source. It was actually one of the first crowdfunding campaigns on the internet that made it possible for Blender to become open source.
I didn't know about the history before that. But by now, I do, of course.  Ton Roosendaal, the founder of Blender, has had a few different companies in the past. And the last one, where he made Blender, actually went bankrupt. And so, all the possessions of the company were then owned by the investment company. And basically, it was a crowdfunding campaign to raise, if I remember correctly, 200,000 Euros in order to pay the investors, and they would then agree to release Blender as open-source. [00:04:23] JN: Sorry. Just to jump in. And so, what kind of year was that? Just to get a scale of what the 200,000 -  [00:04:28] SS: This was 2003.  [00:04:29] JN: 2003. Okay. Awesome. [00:04:30] SS: Yeah. And so, that's where I first started using Blender. And that, again, had this edit mode and object mode. And so, what I liked so much about LightWave I found back in Blender, except then it was actually legal for me to use it. That was a nice bonus.  [00:04:48] JN: Excellent. [00:04:48] SS: In 2004, I filed my first bug report. And that was pretty much my involvement with Blender for years, until I did my master's in Game and Media Technology. And there was a course about 3D modeling. And one of my fellow students, a friend of mine, told me, "Hey, listen. Blender is in Amsterdam. And we can visit. They're working on a new short. And on Friday, we can actually visit."  Of course, we did. And we rang the doorbell. Ton Roosendaal opens the door. Big smile on his face. Welcomes us in. And that is the first time that I actually saw the Blender Institute. [00:05:24] JN: That's amazing. [00:05:25] SS: Yeah. It was amazing. It was fantastic. The atmosphere was really good. There were people pranking around. They were working on Sintel at the time. And every time the rigger got his hands on the model, he changed the color of her eyes, which was hilarious.  Fast forward a bit more, I've done my master's and I'm working on my PhD. That was on crowd simulation.
And I wanted to do that in the Blender game engine. Because then you could record the movement of the simulation in the game engine. You could record it to animation data. Go back to Blender itself and then replay what it recorded. And that was a really useful tool for me as a developer to go back to the point where it started going wonky and then trigger that particular wonky crowd agent to explain its reasoning.  And because all of that was in the same environment, that was super convenient to work with. But Blender couldn't do what I needed to do. I needed information about collisions between those agents. And I wanted to know where exactly that collision was. And Blender did not give me that information.  I knew that because Blender was using the Bullet physics framework underneath, that information should be able to - it was there. I knew the framework would give it to me. And that was actually my first functional patch that I sent in to Blender, to get that info out there.  After a few patches, I started nagging more and more. Like, "Hey, is it in yet? Is it in yet? Poke, poke, poke, poke, poke. Is it in yet?" And then they got annoyed by me. And it was like, "Here, you have your own commit access now. You just do it yourself." That is how I became a Blender developer.  Before working there, I was already helping out with a project. There's also the yearly Blender Conference. And since 2011, I've been to every one of them. Of course, being a student, I didn't have much money. And back then, you would get your ticket for free if you had a talk. I made sure that I had a talk.  I think, since then, I've never paid for more than two conferences. Because I wanted to be part of that community and part of the life there. And that, again, for me was a big influence in wanting to work at Blender because it was so pleasant and so full of energy. And everybody is so happy about seeing each other, about working with Blender.
And it also made sure that Blender developers and Blender artists were all at the same conference, mingling with each other, getting inspired by each other. That was a fantastic atmosphere. [00:07:52] JN: It sounds fun. Yeah.  [00:07:54] SS: And so, I started going to more and more of these weeklies and messaging Ton more and more, so that he understood what I had to offer and that I really wanted to work there. And in the end, it worked out. Since 2016, I've been working at the Blender Institute, first as a web developer. Because Blender Cloud had just been there for a year or so. And that was a source of income for the studio. Because that is what it was built for. To provide a more stable income source for the animation studio. And so, because I could do web stuff, I could start working there. And slowly, I maneuvered myself towards more and more Blender development. And that's what I do all the time now. [00:08:36] JN: Perfect. Yeah. And so, I want to talk a little bit about that maneuvering. Because, first of all, amazing showcase of where committing to the software that you use to scratch your own itch can get you. That's really fantastic.  Now, as we mentioned, you lead the animation and rigging module. Did you end up in that space because of the work you did during your master's or PhD? Or is this like another unrelated journey?  [00:08:59] SS: No. It is related. And it's not a coincidence that I did my master's in that area. Because I really, really enjoy animation. I love watching animators work. I'm not that good at it. But I love seeing them work, and seeing a character go from a walking stage, where it's just moving from A to B, to seeing an actor. Something with a soul and a motivation behind that movement to go from A to B, and seeing that come to life. That, I feel, is - it's still magic. Being able to participate in that process and being able to help people achieve that, I think it's really nice work. [00:09:37] JN: That's wonderful.
I want to dig into - you mentioned a bunch of Blender features at the beginning. From the modeling, to the sculpting, to the various rendering techniques. I want to dig into a little bit of what Blender enables people to do. And I guess one area that I would like to start - oh, I guess, actually, to take kind of a holistic view. Say I'm creating a short film in Blender and I'm doing the whole pipeline in Blender. What is the sequence of features that I'm going to use to put that together?  [00:10:03] SS: Ooh. If you're really enticed to do everything in Blender, then I think you would start with Grease Pencil and using that to create storyboards.  [00:10:12] JN: Okay.  [00:10:14] SS: You may want to do that still on paper because it's less tempting to start making things pretty. And at that stage, it is very important that you keep things ugly and very, very fast to iterate over. But you can use Grease Pencil for that for sure.  Then you may want to cut that up into sequences and shots and start planning out those shots. Start building temp everything. Temporary sets. And make an inventory of which locations we have. What do they look like? What do they feel like? Build those sets. Build your characters. Design your characters. Probably, again, you might not do that in Blender but rather in Krita, or with pen and paper, to just start on the design of the characters and the look dev, basically.  [00:11:02] JN: Sure.  [00:11:04] SS: And from then, I would say start building stuff and blocking - making more proxy objects that you can play around with in 3D space so that you can work on layout. Maybe start splitting up files into different blend files that you can link into the shot file. And then add some overrides on top so that you can move things around. And then you can play with your storyline and you can slowly start replacing those drawings you made for the storyboard and replace them with actual 3D rendered, ugly-looking, super-plain, boring, low-poly stuff.
And then you'd draw the rest of the owl.  [00:11:47] JN: Yeah, of course. Yeah. Yeah. Absolutely. Yeah. Speaking of some of that drawing, there's the owl. Obviously, what you work on is animation and rigging. I, and I think most people I know, have at this point played with the modeling tools in Blender. We all did the donut tutorial during the lockdown. And I haven't got as far as animation and rigging. What are some of the tools and toys available for content creators and developers in there?  [00:12:09] SS: Well, like the most important tool is, of course, setting a keyframe. You can key pretty much anything in Blender. The current incarnation of Blender's animation system came with Blender 2.5. That was years ago. But the big shift was to make everything animatable, instead of only these specific properties that were opened up for animation. Everything was animatable except where it had a flag that would prevent that.  [00:12:41] JN: Okay. Sorry to pause and dig. Folks who haven't been in Blender, what does that mean? What are the things that are animatable that might be unexpected from before that transition?  [00:12:51] SS: Oh. Well, pretty much all material properties can be animated. But also, visibility properties of things can be animated. You can animate a cube that hides and unhides. And that is still one of the more annoying things to animate at the moment, because of reasons.  [00:13:09] JN: I mean, yeah. I guess that's more than an opacity shift, right?  [00:13:12] SS: It is. It is. That is true. When you animate the actual visibility of an object, other things come into play. Because you cannot select it anymore. You cannot see it in your 3D viewport. And that means that, if it were still selected, you might influence it. You might select another object and start moving things without knowing that you still have that hidden object also selected and also moving around.
That is one of the usability rules that we have in Blender: when something is invisible, you shouldn't be able to manipulate it. Which works out pretty well in most cases, except for when you want to animate the visibility of that object. Because then, once it's invisible, you cannot select it anymore to make it not invisible. There's ways around it but it's a bit annoying. That's one of the things that we have in mind while working on newer animation stuff.  As I was saying, you can hover over any property in the properties panel or in other areas of Blender. Press the I key on your keyboard and that will set a key for that particular property. [00:14:13] JN: Amazing. [00:14:14] SS: And you can change time, hover over the same thing again, press I again, and then you can manipulate things over time. A.K.A. animate. [00:14:23] JN: Very cool. [00:14:23] SS: Other tools that we have are auto-keying. You press a record button and then, when you move things around in your scene, that will automatically key them. And so, you can pose your character. Go to another frame. Pose it. Go to the next frame. Pose it again. And you don't have to hover over whatever property you want to key. You don't have to press the I button. You don't have to do anything. It will just record everything.  And for many animators, this is their go-to default because they don't want to think about keying in particular. They just want to think about character actions and animating things instead of the whole technical part of creating those keys.  [00:14:59] JN: Yeah. That immediately took me back to warm, fuzzy Flash animation land. That made me very happy to hear that.  [00:15:05] SS: Yeah. And I would say another important tool that we have, which is also very trivial for anybody who has done animation, but I just want to mention it, is the interpolation options. Every key can say what happens in the span of time until the next key.
And that means that you can have linear interpolation, which just goes from A to B in a straight line. The default is a Bézier curve, which smoothly goes from value to value.  When you're trying to look at your rough animation, that is super getting in the way though. It looks nice. It looks super smooth, like you've done a great job. But the problem is that you never know when you're looking at a key that you set, and when you're looking at a pose that was generated by Blender simply because it had to go from A to B. And so, that is also like the blocking that I mentioned earlier. That refers to the shape of the curve in the graph editor. When you start blocking out your animation, you set all the keys to constant interpolation. And with that, you only see those poses that you made yourself, and it makes it a lot easier to see how dense those poses are. What the sequence of them is. How they fit together. And if you don't do that, your animation will just be floating around in space and it's very hard to see what you're doing. [00:16:24] JN: Fascinating. [00:16:25] SS: That's a few of the tools. I mentioned the graph editor. That shows all these properties and how they change over time. We have the Dope Sheet. That doesn't show how those properties change over time. It just shows the keyframes. If you want to have them in a bit more of a - yeah, a different overview, that's a little bit cleaner. That allows you to show more keyframes in view for retiming purposes and that kind of thing. Then the Dope Sheet is really handy.  [00:16:50] JN: Mm-hmm. Awesome. And so, I guess that's on the animation side then. When it comes to, I guess, things like character rigging - I have only a very passing understanding of rigging from a little bit of game dev experience. But what goes into character rigging for Blender?  [00:17:03] SS: Well, I think the armature is pretty standard in character animation.
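[For readers following along in code: the constant-versus-linear interpolation behavior Sybren describes can be sketched in a few lines of plain Python. This is a toy model, not Blender's actual implementation; the function name and key layout are purely illustrative.]

```python
# Toy sketch of keyframe interpolation modes (not Blender's code).
# keys: a sorted list of (frame, value) pairs set by the animator.

def evaluate(keys, frame, mode="linear"):
    """Return the animated value at `frame` for the given mode."""
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            if mode == "constant":
                return v0  # hold the pose until the next key (blocking)
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)  # straight line from A to B

keys = [(1, 0.0), (25, 10.0)]
evaluate(keys, 13, "linear")    # halfway between the keys: 5.0
evaluate(keys, 13, "constant")  # still holding the first pose: 0.0
```

With constant interpolation you only ever see values the animator set, which is exactly why it's preferred during blocking; a real Bézier mode would add easing curves between the keys.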
Not all 3D animation software actually uses armatures in the way that Blender does. And I think the way Blender does things is quite good.  [00:17:16] JN: Okay. And how is that?  [00:17:19] SS: Well, first, you would create your model and have that in, like, a T-pose, or an A-pose, or something that is easy to rig up, so that the left arm and the right arm are in the same pose. And then you create the armature. That basically is a thing that consists of bones. And those bones, they connect to the mesh.  By default, the closest bone is connected to that part of the mesh. And they have a nice fall-off to the next bone's influence. And this allows you to just rotate a bone, and that, just like in real life with real characters, takes the flesh and the skin along with it. And that saves you a lot of time in animating because you don't have to move every vertex of your mesh by hand. You can just move them as one group and just manipulate those bones.  And that is what is called the deformation rig, or the deformation armature. And that is just: moving a bone moves part of that mesh. Now, this is not always the best way for an animator to work. What you typically put on top of this is more bones, with some custom shapes so that you can distinguish them, and they can be shown outside of the body instead of always being inside. And that's the control rig, basically.  And those can have more smartness in them that allows you different ways of moving. By default, those bones move in a forward kinematic way. That is, you rotate a bone and then all its children that are connected to it just rotate along with it in unison. And this is great for, like, waving motion. Because you only have to rotate the forearm and the rest of the hand goes along with it.  But for putting a hand somewhere to pick up a cup, for example, it's much easier to grab that hand and then let the other bones rotate so that that hand follows everything. And that is called inverse kinematics.
And with Blender as it is now, you need to set up different chains of bones for forward kinematics and for inverse kinematics, and have some way to switch between the two. Snap one to the other. That gets a little bit more complex.  Fortunately, Blender comes with a tool called Rigify, where you can just say, I want to have a standard human. And that creates all the sets of bones for you, which you can then scale, and adjust, and move around so that it matches your character model. And then, once you've basically told Rigify what your character looks like, you click on a generate button and then it creates all the switching between forward and inverse kinematics. Turning on and off different layers. It adds more detailed control to the face that you can turn on and off. When you're blocking, you just have to have motion. When you're working on face animation, you can turn on the things that you need at that time. And that is a quite nice way to start working with animation.  [00:20:16] SS: That takes away a lot of the complexities of this. [00:20:17] JN: I have so many questions about how that works and how that generation works. I don't know how to start. I guess one thing I'm very curious about is how the deformation between the actual model and the armature works. How the armature is attached to the mesh, implementation-wise. I guess it's just like moving vertices, right? Or translating vertices.  [00:20:40] SS: It is. Interestingly, it was in, I think, '86 that there was the first paper about skeletal animation. And that was specifically made for hand animation, so you could put bones in the hands and then the vertices come along with them. That is still basically the technique that we use now, which is quite interesting.  Technically, what you do is you divide - well, you assign every vertex to one or more vertex groups.
In your mesh - and Blender can do this automatically for you. The end result in your mesh is that you have a vertex group for every bone, which is just mapped by name. And every vertex in the mesh can have a certain weight. It can be influenced by one vertex group for 100%. Or it can be in that vertex group and another one, both for 50%, and then you get like a smooth fall-off.  [00:21:31] JN: Interesting. [00:21:32] SS: And then, when a bone moves, it has a certain transform between its rest pose and its actual current pose. That could be moved one meter to the left, or rotated by this amount. Usually a combination of the two. Maybe some scaling involved. And it's that transformation that brings it from its rest position to its current position that is then also applied to all the vertices in that vertex group. And by doing all of that, you have your deformation. That's how everything moves along with it. [00:21:59] JN: Very cool.  [00:22:01] SS: And this is also something that we want to address. Probably not in the next year. Probably not in the next year and a half. But maybe after that. Because, as I just said, we have this rest position of the armature and then you have the current position. And that rest position - there's only one rest position currently in Blender. And that means that the way you model your character, that is how you put your bones in the mesh. That is the rest position of your bones. And that is also the reference position for all animation.  This doesn't necessarily have to be the nicest way to go. For example, when you make a 3D scan of an actual person, there's no way that they have their arms mathematically exactly in the same place, but just mirrored. And no way that they're mathematically perfectly symmetrical. Even if they were, 3D scans are noisy, so the output model won't be.  And also, for animating, it helps when the rest position is, like, axis-oriented, so that the arms are really going straight left and right.
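[The vertex-group weighting Sybren describes is, in essence, linear blend skinning: each vertex sums up the transforms of the bones that influence it, scaled by its weights. Here is a toy sketch in plain Python, using translation-only bone transforms for brevity; all the names and the data layout are illustrative, not Blender's API.]

```python
# Toy sketch of linear blend skinning (not Blender's implementation).
# Each vertex carries weights per bone; the bone's rest-to-current
# offset is applied to the vertex, weighted and summed.

def skin_vertex(vertex, weights, bone_offsets):
    """vertex: (x, y, z); weights: {bone_name: weight};
    bone_offsets: {bone_name: (dx, dy, dz)} (translation-only)."""
    x = y = z = 0.0
    for bone, w in weights.items():
        dx, dy, dz = bone_offsets[bone]
        x += w * (vertex[0] + dx)
        y += w * (vertex[1] + dy)
        z += w * (vertex[2] + dz)
    return (x, y, z)

# A vertex weighted 50/50 between two bones follows each halfway:
offsets = {"upper_arm": (0.0, 0.0, 0.0), "forearm": (1.0, 0.0, 0.0)}
skin_vertex((0.0, 2.0, 0.0), {"upper_arm": 0.5, "forearm": 0.5}, offsets)
# -> (0.5, 2.0, 0.0): half of the forearm's movement, a smooth fall-off
```

A full implementation would use 4x4 matrices per bone rather than offsets, which is also where the "candy wrapper" twisting artifact mentioned later comes from.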
For a 3D scan, your scanned model will stand much more comfortably when the arms are a bit lower. Like in an A-pose. Also, for deformations, that might actually be a better starting point. And that means that we want to have different rest positions. One where you put the - where you have the binding pose, basically, for the mesh, where all the bones are in the neutral position where they can be bound to the mesh. And then another pose where you can just move those arms into perfectly straight, axis-aligned orientations and have that as the reference pose for animation.  [00:23:38] JN: Understood. [00:23:39] SS: And, yeah. That's like one of the things we really want to work on at some point.  [00:23:43] JN: Interesting. Okay. Cool. Yeah. Thank you for scratching that itch I've had for a very long time. The other thing I wanted to ask about is you mentioned forward kinematics and inverse kinematics there. And that's another area that I think lots of people have heard of. But I would be really interested in how the calculation of, like, "Okay, I've moved this bone. How is that transform rolling forward or backward, as the case may be?" What is, I guess, the algorithm that kicks in there to make those translations occur between those bones?  [00:24:08] SS: Forward is very simple. Every bone has a translation, which we call location in the interface. But mathematically speaking, it's a translation, a rotation and a scale. Those three are combined in the right order to form a matrix. And so, every bone has a transform matrix.  And forward kinematics is just multiplying those matrices from the parent all the way to the child. And then you have the final transform of the child. That takes into account how all the parents transformed before that. And then you can apply that matrix to the vertex groups and then you're there.  Oh, going back a little bit to how the mesh deformation works. This was one of the two options that Blender currently has.
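[The forward-kinematics step just described - multiplying each bone's local matrix down the chain from parent to child - can be sketched like this. The sketch uses 2D homogeneous matrices and translation-only bones for brevity; it is illustrative, not Blender's code.]

```python
# Toy sketch of forward kinematics: the child's world transform is
# the product of every ancestor's local matrix, root first.

def mat_mul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translation(dx, dy):
    """2D homogeneous translation matrix."""
    return [[1, 0, dx], [0, 1, dy], [0, 0, 1]]

def world_transform(chain):
    """chain: list of local bone matrices, root bone first."""
    m = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity
    for local in chain:
        m = mat_mul(m, local)  # parent's transform carries into the child
    return m

# Upper arm moves right by 2, forearm right by 1: the hand ends at x = 3.
chain = [translation(2, 0), translation(1, 0)]
world_transform(chain)[0][2]  # -> 3
```

Real bones combine translation, rotation and scale into each local matrix, exactly as Sybren describes, but the chain multiplication is the same.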
We also have the dual quaternion-based one, which is better at keeping the volume of the mesh constant. Because if you have, like, a twisting motion, say you have an arm here and you twist it like that. If you do it linearly, you will get what we call the candy wrapper artifact.  In the middle, it would just squeeze down to nothingness. And the dual quaternion system works around that. It has some other downsides. But you can switch between the two.  [00:25:21] JN: One of which is you have to work with quaternions. But, yeah.  [00:25:24] SS: Well, you don't have to work with quaternions to use the dual quaternion method. But that was a side step from the forward and inverse kinematics. The forward is simple. That is just matrix multiplications. The reverse of that, the inverse kinematics, is way harder. Because, usually, there's an infinite number of bone orientations that will lead to the final position of that end effector.  And you can imagine that, when you're holding a cup, your elbow can be there or there and that cup is still in the same position. And so, what most inverse kinematic systems do is taking many, many, many tiny steps and then seeing: how does this matrix need to be modified in order to bring the cup to the desired transform? And then taking many, many small steps and hoping that that eventually leads you to the right positioning. But it starts out at the current orientation. It tries to minimize how it's moving around.  Again, there's multiple options in Blender, with different implementations of this algorithm, that do or do not take the current orientation into account. But in the end, they boil down to somewhat of a trial and error, with a mathematically smarter way of knowing roughly in which direction to go, and then doing that a whole bunch of times. Refining the end result.  [00:26:48] JN: Fascinating. Yeah. The infinite possibilities of getting that original - the node you're working back from into position.
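[The iterative many-small-steps process Sybren describes can be illustrated with a toy 2D cyclic coordinate descent (CCD) solver, one of several iterative IK schemes. Blender's actual solvers differ in the details, but the shape is the same: repeated small corrections that pull the end effector toward a target, starting from the current pose.]

```python
import math

# Toy 2D CCD inverse kinematics sketch (illustrative, not Blender's code).
# lengths: bone lengths in a chain; target: (x, y) for the end effector.

def solve_ik(lengths, target, iterations=100):
    angles = [0.0] * len(lengths)  # start from the current (rest) pose

    def joint_positions():
        pts, x, y, a = [(0.0, 0.0)], 0.0, 0.0, 0.0
        for ang, length in zip(angles, lengths):
            a += ang
            x += length * math.cos(a)
            y += length * math.sin(a)
            pts.append((x, y))
        return pts

    for _ in range(iterations):
        for i in reversed(range(len(lengths))):
            pts = joint_positions()
            end, joint = pts[-1], pts[i]
            # rotate joint i so the end effector swings toward the target
            to_end = math.atan2(end[1] - joint[1], end[0] - joint[0])
            to_tgt = math.atan2(target[1] - joint[1], target[0] - joint[0])
            angles[i] += to_tgt - to_end
    return joint_positions()[-1]

end = solve_ik([1.0, 1.0], (1.0, 1.0))
# `end` is now very close to the reachable target (1.0, 1.0)
```

Note how the solver converges to *one* of the infinitely many valid elbow positions: the one closest to the starting pose, which matches Sybren's point about minimizing how much the chain moves around.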
That never occurred to me. That's really interesting. To jump to a different area a little bit now. One of the things you mentioned at the start - again, just like hitting all the questions I have about Blender. You spoke about rendering at the start and the different rendering options and rendering modes. I've heard the term Cycles. And, obviously, more recently, there's been Eevee. Can you tell us a little bit about these two different renderers and when each might be used in Blender?  [00:27:20] SS: Oh, sure. Cycles. First of all, a little disclaimer. I'm not a render engine guy. I might get, like, the finer details of the terminology wrong. Apologies for that. But Cycles is a path-tracing render engine. It shoots out rays of vision, basically, from a camera. And it just says, "Okay, what would I see if I look in that direction?" And then it shoots a ray in that direction. And it might hit something, and then a bunch of things can happen.  It could return, "Oh, this is blue." That's a diffuse color. We can see that equally from all directions. I can just say it's blue. Or maybe it also has a specularity to it. And then that ray needs to know, "Oh, but what is being reflected then in that specularity?" It needs to bounce and continue on.  By the way, also for the diffuse part, it needs to bounce and continue on, because it needs to know: is this part of this mesh even illuminated? Or am I bouncing off into a black hole where there's only shadow? Then it becomes black. Or do you bounce and hit maybe a huge, shiny light box or something?  And this is also where it gets a bit tricky. Because a diffuse surface scatters in all directions. And you can only try one when you don't have a quantum computer. You can only try one at a time. That means that Cycles actually does try only one thing at a time. And it might hit that light. It might not. And that is why, when you stop a Cycles render very, very early, it will be very grainy and noisy.
Because some of those rays may have hit that light source and become bright. But the neighbor pixel may not have hit that light source and stayed dark. And that is sort of a randomized process.  The more you send out rays to the same pixel, 10% might hit the darkness. But 90% might hit the light. And then you keep averaging over those samples. And eventually, you will end up with what should be the truth. And that is then 90% illuminated, because there's a big, fat light next to it. And that is why you see that grain when you start rendering with Cycles. It starts out very grainy and then you see that grain dissolve, basically, and turn into, like, a pretty image.  And the more time you give it, the prettier the image will be, but also the longer your render job will take. And so, that's a balance to hit. Also, there's a lot of development in Cycles on guiding those rays to the more important bits. So that if you have, like, a small light source, it's still able to find that small light source without having to randomly sample everything on the planet. Because the light source is very small, a random ray wouldn't hit it unless it's, like, exactly on there. And there is, like, guidance in place to help the system out and speed it up. That is Cycles. It's shooting rays. Taking time. Getting faster over time as well. And getting really good, realistic images. Because all the light bouncing and effects like that are being taken into account.  Then you have Eevee, which is another render engine that takes a much more game-engine-like approach. I don't know that much about it. But as far as I know, it does project - it has a shape in 3D space that it projects onto the screen. And then it knows, "Okay. These polygons take up this amount of space. That is why I have to draw the texture there." And so, that is much more of a real-time engine. It is aimed at similar techniques to what Unreal Engine is doing. [00:31:06] JN: Sure.
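[The grain-then-converge behavior Sybren describes for Cycles is the hallmark of Monte Carlo sampling: each ray for a pixel randomly hits the light or misses it, and averaging more samples converges on the true illumination. A toy sketch in plain Python (not a renderer; the 90% figure mirrors his example, and all names are illustrative):]

```python
import random

# Toy Monte Carlo sketch of why a path-traced pixel starts out noisy.
# Each sample (ray) either hits the light (brightness 1) or misses (0);
# the pixel's value is the average over all samples taken so far.

def render_pixel(samples, light_probability=0.9, seed=None):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(samples)
               if rng.random() < light_probability)
    return hits / samples  # pixel brightness in [0, 1]

render_pixel(4)      # few samples: grainy, could easily be 0.5 or 1.0
render_pixel(10000)  # many samples: converges very close to 0.9
```

The error of such an estimate shrinks roughly with the square root of the sample count, which is why the last bits of grain take so long to dissolve, and why the ray-guiding work Sybren mentions (steering samples toward small lights) pays off so much.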
[00:31:06] SS: Actually, when I saw the demos of Unreal Engine - I don't know which version was out at the time - I was amazed by it. And it looked so good. And I was like, "My - what are we doing in Cycles? There is this engine out there that can do this in real-time." And we have to wait for so long. And then [inaudible 00:31:24], the creator of Eevee, came to the studio to demo early, early, early versions of Eevee. I was so relieved. Because, yeah, it really helps us to move in that direction as well. And now both are there and both are amazing.  There is another render engine called the Workbench Engine. And that is actually what you see in Blender itself when you're in your 3D viewport and you don't have it set to show the fully-rendered image. You just see the solid geometry. That is the Workbench Engine. You can also use that to just render out an entire film if you want. [00:32:01] JN: My question from that - which, obviously, Unreal being a real-time rendering engine makes sense because it's a game. You're moving around. Whereas Blender is a movie. It's going to be static once you render it, to some extent. When is Eevee used? When do you want a real-time rendering engine in Blender content creation? [00:32:19] SS: A very big difference. I think, even from an architectural point of view, an even bigger difference between Unreal and Blender is that Unreal is optimized for the gamer's experience. And that means that you can have quite a lot of precomputing going on. Because a lot of things won't change, or will stay within certain parameters, because that's how the game was made.  And so, while the computer is doing these pre-computations, it can take all those things into account and optimize things and pre-compute things for you. Whereas Blender is aimed at the creator. And so, Eevee has to do less - it can do less pre-computing. Because you can delete that mesh at any one time. You can be modifying the mesh.
You can be texturing it. And so, it has to respond to that immediately instead of baking everything down into easily accessible stuff for the GPU so that the gamer has the good experience.  And that also hints towards the usefulness for Blender users. Well, imagine texturing while it's being lit, while you have the smoke in there, so that you know which parts of the texture are going to be covered by the mist pass - you won't see that much detail there anyway. And you can just paint that in real-time, versus doing that in a separate texturing application and then bringing it in and then seeing what it looks like.  Typically, you can make things in Eevee look really good. If you look at our production Charge, that was made entirely in Eevee. I think it looks amazing. But also, when I tried it out myself, I find Cycles a lot easier to work with, because it's just about placing lights and textures and toying around with materials. Whereas Eevee needs a lot more handholding in terms of where it can gather reflections, where it can gather other information. It needs that info to stay real-time, or near real-time, to be fast. And so, it needed a bunch more handholding to get it to work well. That's my outside perspective anyway.  [00:34:29] JN: That's great. That definitely cleared up some of the questions I had. A couple of times you've mentioned films that Blender themselves has made. And recently, I think especially this last year, I've seen a lot of attribution of Blender in larger projects. I think RRR, which is one that obviously did incredibly well - I've seen special acclaim about its VFX, and Blender was used in that.  I know that aside from animation and rigging, you also work on some projects to do with large-scale render pipelines and that kind of thing. First of all, what does it take to make Blender useful for a full feature film like those produced by yourselves or RRR? Let's start there. 
[00:35:10] SS: Well, what makes Blender useful - it's a hard question to answer. Because it is useful. People do stuff with it. From a very boring point of view - you're hitting on large-scale render farms. Imagine using commercial software that you have to buy expensive licenses for and manage those licenses. Imagine running that on a big render farm, where you might want to run in the cloud and spin up and spin down different render nodes depending on whether you need them or not. That's a nightmare. Also, imagine the purely hypothetical case where, for some reason, all of your staff, all of your artists, have to work from home. They can't work unless there is an infrastructure that can check that their license is actually valid. They can't do their work if that license server can't be reached. Or they're not allowed to install anything on their home computer.  Whereas with Blender, nobody cares. You just download it, run it, and you're good to go. You just need access to your files. And you can always use it from wherever you want. That is a big advantage, especially when you're looking at these larger productions.  Another advantage of using Blender is that it does a lot of things. Instead of juggling all kinds of different applications for different purposes - and making sure that everybody's using the same version of every application and has it working and running on their computer - you can just work in Blender for modeling, texturing, doing ink lines. For example, Spider-Verse used Grease Pencil for part of the ink lines.  [00:36:48] JN: I did not know that. Fascinating. Okay. [00:36:50] SS: If you look at Spider-Verse 2, many of the ink lines were done in Blender with Grease Pencil. But also, the little bit of the Lego universe - that entire scene was made in Blender, actually by a 14-year-old kid. That is also what I find so inspiring about working on Blender and at Blender.  We're putting this tool into so many hands. 
We open up so many possibilities for people because it is free and because anybody can just download it, and use it, and be creative. And it has applications also in very practical fields, like 3D modeling for 3D printing, for example. You can start producing things. And it really opens up all kinds of different possibilities for people who otherwise wouldn't have been able to do this. And that is very inspiring and very, very humbling for me as well. [00:37:49] JN: Yeah. Absolutely. And I want to come back to that topic in a bit. I have some questions about accessibility on that. But whilst we're on this topic, you mentioned render farms. I know you work on another project called Flamenco, which is all about render management. Can you tell us a little bit about Flamenco and what its use cases are?  [00:38:04] SS: Flamenco is the render farm software that we use in the studio. And it has a long history, going back to I think 2011. Some people worked on a tool called Brender, which was like a Blender-Render -  [00:38:18] JN: Genius.  [00:38:19] SS: And Francesco, who's been working at Blender for a long time, he also worked on that, or at least on the successor of that. After Flamenco started, it had a bunch of different incarnations. And that is basically where I stepped in, when I started working at Blender Cloud. Because we thought it might be a nice idea to integrate Flamenco with Blender Cloud.  On Blender Cloud, people could just start their own film production projects and have asset management in there. The user accounts were already there. Project membership was already taken care of. Hooking up some render management tool to that, where the results are also visible on that same web interface within that same project - that was kind of tempting.  We did. And that was a nice tool that worked for quite a while. But we noticed, especially when talking with studios, that they weren't so inclined to use it. 
Because it's always tricky and sensitive. You're using somebody else's tool on somebody else's infrastructure to work on your intellectual property, which shouldn't be leaking out until the thing is actually released.  And even though we made it in such a way - like Flamenco is still now - that you could use your own machines for rendering, part of it was still connected to Blender Cloud. And that was hard to work with on different levels. Also, the part that you would host yourself on your own infrastructure was kind of tricky to install. And it became, over time, harder. Because we also didn't have that much time to improve things there, with all the other stuff that's going on at the institute as well.  In early '22, we decided it was time for a new incarnation of Flamenco. We took out the Blender Cloud dependency. Basically rewrote most of it from scratch. And in about seven months, I think, we had a usable version. And it really focused on getting things up and running as quickly as possible, as easily as possible.  Ton also was very adamant about no databases. And I was like, "Something like Postgres - it's easy to install. You just do apt-get install Postgres, you press enter, and then it's done." But we also had the focus on smaller studios and people at home. They should be able to use it. And it's not for these huge studios. Because, first of all, there are not that many huge VFX studios out there compared to smaller animation studios. It opens up a broader market. We're not selling anything. But still, more people can use it. And also, it simplifies a lot of things.  And so, that no databases - I was kind of, "Oh, Postgres is so tempting and it's so easy to install." Until I started installing it on Windows for the first time. And that was hell. So many next, next, next, next, next. Oh, now you have to make this really important decision that you have no idea what it actually means. 
Unless you really read up and want to be a Postgres maintainer, then it makes sense. But if you're just somebody at home who wants their renders - so I agreed with Ton. Like, "No. This is not what we want to do."  Now Flamenco uses SQLite, which is an embedded database engine. You don't even need to have that database file; if you delete it, we just start from scratch. Ton said to make it as simple as running Blender. With Blender, you can just download a ZIP, double-click on blender.exe, and you have Blender. You have it running. And it should be that simple to get a render farm running.  And of course, that's very challenging. Because a render farm has to coordinate between all kinds of different computers. It didn't become that simple. But if you just want to run it on one computer, it actually is. You can just double-click the Flamenco Manager executable. It starts up your browser. It gives you an introduction about how it works, what the architecture is. It asks for a few simple things - like, where are your files? Where's your Blender? And then you have a working farm.  I'm slightly simplifying things here. But I did make a video, 4 minutes and 59 seconds long. It's on YouTube. And that is just showing off, if everything goes well, how fast you can get it running on a single computer.  [00:42:39] JN: Perfect. We will find that and make sure it makes its way into the show notes. [00:42:41] SS: All right. Good. JN: I'm glad you mentioned this. The whole time you were talking about Postgres, I was like, "Surely, this is going to end in SQLite." And, yeah. Perfect.  [00:42:51] SS: Yeah. There's still a bunch of things that I would love to update on it. We have some people in the community also working a bit on Flamenco. I can only spend like 10% of my time on it, unfortunately. The rest is all the Animation 2025 project - working on animation and rigging tooling, which is also a fantastic project to work on. 
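The zero-setup property Sybren credits SQLite with comes from it being an in-process library rather than a server: nothing to install, nothing to administer, and deleting the file resets everything. A minimal sketch with Python's built-in sqlite3 module - the jobs table and its columns are made up for illustration, not Flamenco's actual schema:

```python
import sqlite3

# An in-process database: no server, no installer, no admin decisions.
# ":memory:" keeps it in RAM; a file path would persist it, and deleting
# that file starts you from scratch - exactly the zero-maintenance idea.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")
conn.execute("INSERT INTO jobs (name, status) VALUES (?, ?)", ("shot_010", "queued"))
conn.commit()

rows = conn.execute("SELECT name, status FROM jobs").fetchall()
print(rows)  # [('shot_010', 'queued')]
```

The trade-off is the one discussed above: an embedded engine handles the single-machine and small-studio case effortlessly, while a server like Postgres only pays off at a scale Flamenco deliberately doesn't target.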
But Flamenco is getting a lot more traction now that it runs purely on your own hardware. It has no connection to the outside world. A lot more people are using it. And that is really, really good to see. [00:43:23] JN: Yeah. That's fascinating in itself. The psychology of open-source, free software and how you position things to be most used by your audience is very interesting. Earlier we were talking a little bit about accessibility. Because you mentioned one of the really inspiring things about working on Blender is all the people who can use it to create content. We mentioned the 14-year-old who made the Lego scene in the Spider-Verse movie. And, obviously, there are artists. And your own background shows that researchers use it. What you've got here is a deeply technical tool that has Python scripting embedded. I'm really interested in how you help make Blender as accessible as possible to artists and these other profiles.  [00:43:58] SS: One of the things that we do - what we have going on constantly - is sharing knowledge. And that is one of the goals of the existence of the studio to begin with. There is the studio.blender.org website, where our artists, pretty much on a daily basis, share their knowledge about how to do things in Blender. Because they're on the daily build of Blender, they really are on the cutting edge - sometimes the bleeding edge - of Blender development.  And not only do they help us to move Blender forward. But also, once the tool is there, they investigate and they see how they can use it and what kind of crazy stuff they can do with it. And then they share that knowledge on the Blender Studio website. There are a lot of tutorials and practical information directly from the studio, like all kinds of production lessons.  And one of the training series that I make myself is Scripting for Artists. Scripting things is very powerful. But, also, many artists don't want to become a developer. They don't want to learn to program. 
But also, they don't want to do that mind-numbing repetitive stuff over and over again. And that is basically where Scripting for Artists hooks in.  The first lesson is copy-pasting from the user interface, for example. Blender has a text editor where you can type Python code. You press the play button, and it runs that Python code. But what not many people know is that for any button in the interface and any menu item, you can hover your mouse over it and press Ctrl+C. Go to that text editor, press Ctrl+V, and it will paste the Python code that runs that button or that menu item. And so, that's number one.  If you constantly have to click here, push that button, choose that menu item - just copy-paste that into a file, and you have one play button that runs it all. And we take it from there. And you learn more and more about Python and how it works.  But I try to keep the learning-how-it-works to a minimum and really focus on, "Okay. If you want to achieve this goal, what are the things that you need to know for it?" And I've heard good things about it. People are happy with it. [00:46:23] JN: Excellent. I still can't get over that. I'm now going to want that in every GUI application ever. I guess that's just powered by the Python API, and it's just knowing what the function invocation for each button is. That's just an intense feature.  [00:46:37] SS: That goes a bit into what the architecture of Blender is. And there are a few different layers in there. At the very base, there is what we call DNA. It's the data structure of Blender. And every object, every mesh, every armature is a data block. It's just a block of data that is described by its properties and the types of those properties.  And the DNA system is something that Ton actually designed already in the very first versions of Blender. And that is still in there. 
And so, because of the way it's built, it allows us to still open Blender 1.0 files that are like 30 years old in Blender today and get some reasonable output.  Of course, the render engine that was there at the time doesn't exist anymore. You have to redo some things. But as long as we can bring it to the next version, we always do. And basically, what it is on a technical level - it's just a memory dump of Blender's memory. It just writes all those structs to the blend file, with some meta information in there: which struct type was it? That also describes which fields are there. It also stores where in memory that struct was stored. And that means that all the pointers can just be written as is. [00:47:59] JN: Right. Okay.  [00:48:00] SS: They were pointers in memory. But now, because every data block has an annotation of "this was its pointer", basically, it becomes an identifier for other data blocks in that blend file. That is why saving in Blender is super-fast: there is no processing. There's no converting into other data structures. It just writes everything as is to disk.  And then when loading, it basically has to load everything once, and then it knows where those data blocks ended up in memory now. It can go over all the pointers and just replace the old one with the new one, and it's done. That is a bit about how Blender loads and saves blend files. There is more detail in there. It also saves which version of Blender that file was saved with.  And so, there's lots and lots of versioning code that says, "If the version was older than this, then change this." And that means that if Blender changed from one representation to another, it can do that translation while loading the file, and then it's taken to the next thing. Or maybe we found that certain properties should never be used together - so if that one's on, that one should be off. We can do that in versioning as well. 
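The save-as-memory-dump and pointer-remapping scheme described here can be modeled in a toy Python sketch. Here id() stands in for a struct's memory address; the Block class and the two-pass load are illustrative assumptions, not Blender's actual DNA code:

```python
# Toy model of pointer remapping: data blocks are written with their old
# "memory addresses", and on load an old->new map lets every stored
# pointer be swapped for the freshly loaded block.
class Block:
    def __init__(self, name, ref=None):
        self.name = name
        self.ref = ref  # "pointer" to another Block, or None

def save(blocks):
    # Serialize each block as-is, using its current address as identifier.
    return [{"addr": id(b), "name": b.name,
             "ref": id(b.ref) if b.ref else None} for b in blocks]

def load(records):
    # Pass 1: recreate every block, remembering old address -> new block.
    remap = {r["addr"]: Block(r["name"]) for r in records}
    # Pass 2: replace each stored old pointer with the new object.
    for r in records:
        if r["ref"] is not None:
            remap[r["addr"]].ref = remap[r["ref"]]
    return list(remap.values())

mesh = Block("Mesh")
obj = Block("Object", ref=mesh)
loaded = load(save([mesh, obj]))
assert loaded[1].ref is loaded[0]  # the Object again points at the Mesh
```

Saving is cheap because nothing is converted; loading pays the small cost of one remapping pass over the pointers, which matches the asymmetry Sybren describes.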
All kinds of different things. And that allows us to just keep backward compatibility with older files. That's the DNA structure. And that is what meshes are stored in and what objects are stored in, et cetera.  Then on top of that, you have the RNA system. Both names are from genetics. Somebody who knows about biology may not agree with the terminology, but it is what it is. The RNA system basically provides the Python interface on top of the DNA structures. It can also reorganize things a little bit, so it can make some fake data structures that aren't really there in DNA but make the Python interface more Pythonic.  And this system is also used by the animation system. When you modify a property over time, that creates what we call an F-curve. Because it's basically a function that maps from a point in time to the value of that property - that's where the F comes from. And this F-curve says: this is my path, a.b.c.d.e, and this is my array index. And that could mean that the path is pose.bones["left arm"].rotation_euler with the first index - so that's number zero - and then you have the X rotation of that particular bone that is being targeted by that F-curve. And that, again, uses the RNA system to address it.  That also gives us the freedom to rename some things in DNA while keeping the RNA system the same, so it doesn't break your animation. It gives us the freedom to do things.  And then you have a third layer, which is the operators. Now, to go back to RNA for a little bit: RNA can not just access properties but also expose functions. All kinds of operations on a data block, you can model as functions in RNA. And that then maps to functions in C++ that get called with the right parameters.  But there are also operators in Blender. And these operators can be defined in Python but can also be defined in C++. And in the end, they register into the same registry. 
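The F-curve addressing described here - a data path plus an array index, resolved through the RNA system - can be mimicked in plain Python. The resolver, the nested-dict scene, and the bone name are toy assumptions for illustration; inside Blender the RNA layer does this work (bpy exposes it as path_resolve on data blocks):

```python
import re

def resolve(root, data_path, array_index):
    """Follow an RNA-style data path, then pick one array element,
    e.g. index 0 for the X component of a bone's Euler rotation."""
    value = root
    # Split 'pose.bones["left arm"].rotation_euler' into
    # (attribute, optional-key) steps.
    for attr, key in re.findall(r'(\w+)(?:\["([^"]+)"\])?', data_path):
        value = value[attr]
        if key:
            value = value[key]
    return value[array_index]

# A toy scene modeled as nested dicts (not Blender's real data structures).
scene = {"pose": {"bones": {"left arm": {"rotation_euler": [0.5, 0.0, 0.0]}}}}
x_rot = resolve(scene, 'pose.bones["left arm"].rotation_euler', 0)
print(x_rot)  # 0.5
```

Because the F-curve stores only this string path and an index, renaming the underlying DNA field while keeping the RNA path stable leaves existing animation intact - the decoupling Sybren points out.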
And they're basically functions that get called in specific places and specific ways, which map to certain methods on a class that get called in the Python world.  And these define: when is the operator valid? When can you actually call this, and when can't you? What is the label? What is the category it belongs to? And then that can be hooked up to panels, and menus, and all that kind of stuff. And so, these operators are the glue between functionality in C, or C++, or Python and the user interface. Menus, hotkeys also - they map to just operator calls. Buttons in the user interface, again, just map to operator calls.  And that is what you actually copy-paste when you're doing the copy-paste trick. You press Ctrl+C on a button, and that copies the Python code that is needed to invoke that operator. And that is roughly a sketch of the data and operations part of Blender.  [00:52:44] JN: Yeah, that's super interesting. I mean, I just said maybe some PhD biologists will get mad at the DNA and RNA usage. But when you were describing what the RNA system does, I was like, "Oh, that's such a good metaphor. That's great. That's awesome." Super interesting.  To stick to architecture for a little bit - I guess the other architectural question I had: we mentioned that you work on the animation and rigging module. And I know there's this idea of there being core Blender and then features in modules. What's the setup there, architecture-wise?  [00:53:16] SS: It's getting better. Blender - really, on January 2nd or 4th, I think the 2nd of January - it's the 30th birthday of Blender.  [00:53:24] JN: Wow. Congratulations, first of all.  [00:53:27] SS: Thank you very much. Thank you very much. It is still the same codebase. It is really interesting to sometimes fix a bug that might actually have been there since unknown, long ago. I think the oldest commit that we have in Git right now is from 18 years ago. Before that, versioning was different. 
Apparently, different enough that we couldn't bring it into the sequence of versioning systems that Blender has been in over the years.  But you can imagine that things change. And when teams grow, the organization of Blender's source code has to follow suit. Currently, there is some distinction. The Grease Pencil module, for example, really has their own files, because those are the Grease Pencil files. In the history of Blender, Grease Pencil is relatively new. That really has its own corner. Same for geometry nodes. That has its own corner in the source code.  Animation and rigging - we're creating that. But it's still spread over a whole bunch of other files as well. Still, within those subdirectories, there is a directory about animation with all the animation tools and editors. But the functionality is still scattered around pretty wildly throughout Blender. We're taming that while we add new features.  Mostly it's about features in Blender, or areas of Blender, that relate to each other. Animation and rigging is there. VFX and the video sequence editor I think are one thing. Eevee and the viewport I think are also combined into one module. And so, every module has one module owner who can make decisions and who is responsible for what's going on in that module - which changes land and which are rejected, that kind of thing.  And so, with the animation module, I became the module owner of that back in 2018. Then we were working on Blender 2.80 to get that out. And that was a very, very big amount of work. But it had to be done. And so, for three months, we got a lot of Blender developers from all over the world into Blender HQ to work on the Code Quest, to really get things done. And that was also where we looked at changing some of the modules, re-assigning ownership.  And then there was a question: who wants to be the module owner of animation? Back then, it was just the animation module. And it was said that it's not that much time. 
And you just have to approve a patch every once in a while, and that's it. Yeah, I can do that.  And then it became quite successful. There were a lot of patches coming in. There was a lot of backlog, because there hadn't been somebody really focused on just animation. One of the first pieces of feedback I got from people was that the riggers felt left out. Yeah. Okay. It's renamed now to the animation & rigging module.  And we've been picking up the slack, and people were super happy to see that, finally, there was more motion going on and more momentum in the module. And I think now it's one of the more active modules in Blender. We have meetings every week - one-hour meetings to show each other what we've been working on, but also to discuss design decisions and get feedback from all kinds of different animators and riggers. Not just from people who are in the studio, but to have a broader community where we can gather feedback. And it really helps the developers to move forward. Because you can just show, "Well, I'm thinking about this. But maybe it should be like that." And then they can just answer: A or B? Okay. We go for that. And then you move on. And it really helps us to fly. [00:57:09] JN: Yeah. I mean, that was going to be one of my questions. This huge open-source project. 18 years of commits in Git. I think I saw that you had 140,000 commits to Blender currently for 4.0. And you've got internal developers like yourself at the institute and then FOSS contributors. How does that work get organized? When you're building up to a big release, or you know that you've got a next version of animation dropping, how do you prioritize between what the community is working on and cares about and your internal road map? [00:57:41] SS: Well, for one, we try to keep the road map singular. There's no internal road map of Blender HQ and then an external community road map. 
On the other hand, when there's a community developer who doesn't get paid for their time, you can't tell them what to do. They work on what they find interesting. And it's very personal what they find interesting.  And so, you have to deal with that as well, which can be quite a bit of a challenge. Somebody drops something nice in your lap, but it doesn't fit in with where Blender is going. You have to see how you deal with that. That's a bit of a challenge, but it can also be very inspiring and very cool to see what people are working on and what they're enthusiastic about.  We have a bunch of developers who work just part-time, in their free time, on Blender because they like it, because they want to help out. And it's fun to do in the evenings. We have some developers who work on a development grant. The development fund is managed by the foundation. People can apply for a grant. And that is really for a set amount of time, to work on a fixed amount of functionality. It's really like, "Okay. You're going to work for this many hours a week on this particular project with that as the outcome. And then we can give you this." And that works quite well.  One of the developers in our module is working part-time for a studio, and part-time for us on a development grant. He liked it so much that, instead of working two days for one, two days for the other, and one day in between off, he's now working three days for us and sacrificed his day off for us. He is doing amazing work.  And then there are people who are actually on payroll. They're employees of the Blender Institute. Or, technically, sometimes employees of a payroll company, I think. I think we have a payroll company in Germany, for example, to work with a number of German developers. But that is more like regularly being an employee of a company. [00:59:49] JN: Awesome. I know we're coming close to time. 
I have a couple of, I guess, quick-fire questions for you. We've mentioned older versions of Blender a couple of times here today. But at the time of recording, Blender 4.0 has literally just landed. Are there any features in 4.0 that you're particularly excited to see?  [01:00:07] SS: A lot. It was actually really fun to make the presentation for last year's Blender Conference. That was the end of October, and 4.0 was about to be released. And it was amazing looking back at the past year of the animation and rigging module's work, to see everything that we've done. It's hard to pick out one or two things. But there are so many improvements that were made. Yeah. It's hard to tell.  Also, for most of the world, the Blender 4.0 release is a big thing. And of course, for us as well. But it's such a continual stream of development that it's hard to remember what was in which version exactly.  [01:00:49] JN: No. That's perfect. We'll drop the release notes in the show notes either way. Do you have any tips for folks who want to get into contributing to Blender, whether they have experience or are coming in completely fresh? Never looked at a line of C++ in their life. What should they be looking out for?  [01:01:04] SS: Yeah. If you don't know any C++, I would say, well, either learn it if you want. But that's a decision that you have to make for yourself: do I want to learn this? And then it's probably easier to first work on a few different C++ projects of your own. Because Blender is complex. And it is a mixture of C and C++. We're moving more and more code to C++. But most of that move means renaming the file from .c to .cc and making sure that things don't break when they're compiled by a C++ compiler instead of a C compiler. Many of the structures are still very C-like. If you don't know any C or C++ development, it can be a hard project to get into. It can be very intimidating. Yeah.  The most important thing, I think, is people. 
Connect with people. There's blender.chat. There's a channel called Blender Coders. That is the main hub of communication for Blender coding. If you have questions like "How does this work?" or "What does this function do?", you can just ask there. There's also technical documentation on the wiki, wiki.blender.org. We're moving that soonish to a new platform that will make things even clearer and easier to find. But that will be in a blog post somewhere when that happens.  But I think the most important bit is just to connect with people. And every module has their own channel on Blender Chat. If you're interested in certain topics, you don't have to dive into understanding all of Blender. You can just go into a channel that is dedicated to that topic and talk with people about what you can do. And I think that is regardless of whether you want to do the Python side of things or the C/C++ side of things. Connecting with the people that can help you out, I think, is the most important part.  [01:03:02] JN: Awesome. And then, speaking of connecting with people, are there any Blender creators or animators in your community that you've got your eye on or would recommend to folks wanting to see what people are doing with Blender?  [01:03:12] SS: Oh, yeah. There's a lot. There's a lot. What I would say is, because there are so many, start by looking at the Blender Conference. The Blender Conference last year was amazing. We had presentations from people who worked on Spider-Verse, for example, in different areas. We had a presentation by a guy who talked about doing a whole short film by yourself as a single-person studio. There's always fantastic stuff in all areas of Blender. It's always mind-blowing.  I would say look on YouTube - there's the Blender channel, and there's the Blender Conference 2023 playlist in there. And that has all the talks that were recorded. If you want to look out for people to follow, I think that's a very good start. 
Also, not all of them are on social media. Or some here. Or some there. It's hard to point at any one person. And I'd rather just point at this.  [01:04:04] JN: That works great. We'd love to put a huge conference playlist in the show notes. That's perfect. Well, Sybren, thank you so much. This has been wonderful. Thank you so much for joining us today. [01:04:12] SS: Thank you for having me. It was a pleasure. [END]