EPISODE 1882

[INTRODUCTION]

[0:00:01] ANNOUNCER: Blender Studio is the creative arm of the Blender Foundation, and it's dedicated to producing films, games, and other projects that showcase the full potential of Blender. The Studio functions as both an art and technology lab, and pushes the boundaries of 3D animation through open productions. All of their assets, production files, and workflows are shared publicly, which gives artists and developers valuable resources to learn from and build upon. Most recently, Blender Studio released its second game, DOGWALK, where the playable character is a dog exploring snowy winter woods with a child. The project was built entirely with open-source tools, including Blender, the Godot engine, Krita for concept art, Kitsu for project management, and Linux.

Simon Thommes is a Lead Technical Artist at Blender Studio and a developer on DOGWALK. He joins the podcast with Joe Nash to talk about Blender Studio, the process behind building DOGWALK, and developing a pipeline between Blender and Godot.

Joe Nash is a developer, educator, and award-winning community builder who has worked at companies including GitHub, Twilio, Unity, and PayPal. Joe got his start in software development by creating mods and running servers for Garry's Mod, and game development remains his favorite way to experience and explore new technologies and concepts.

[INTERVIEW]

[0:01:38] JN: Simon, welcome to the show.

[0:01:40] ST: Thank you. Thank you.

[0:01:41] JN: I want to ask you what your role is and what you do, but I guess to set the context for that, for folks who aren't aware, can you run us through what Blender Studio is and how it sits and is different from Blender itself?

[0:01:52] ST: Yeah, it's probably a good idea. A lot of people that we talk to are not actually aware that Blender is also doing its own content creation. That's done in the form of the Blender Studio. The Blender Studio has existed for almost 20 years now, I think, to create actual open projects with Blender, making short films mainly. Recently, we've also done a game. There's been a game before. But the whole concept of the Blender Studio is to support the development of the Blender software by actually testing out the features that are being developed as they're being developed. We're always on the latest main branch of the software, not waiting for releases, so we're immediately testing the features as they make it into Blender. At the same time, we're also pushing the development in different directions. The idea is also to focus on various different art styles and just general themes with the projects that we're doing, to make sure that Blender is usable in all sorts of different branches.

At the same time, we're also using a full open-source pipeline for our projects. Of course, Blender, being open source, is our main tool. But for anything else that we do, like concept art, we use Krita. For our production management, we use Kitsu, which is also open source. We're all on Linux. The idea is really to push open-source content creation beyond just what Blender alone does, make sure that that's possible, and then fill the gaps wherever we need to with our own tools.

While we're doing that, we're also sharing everything that we do openly. In the same spirit of open-source software development for Blender, we're sharing the content that we create, like the short films, under a CC BY license, so people can also use it, or just the video, for testing.
There are people doing rescores and uploading them on YouTube, things like that. Also, the tools that we create around Blender for production management and everything, we share openly. That's a little bit of a rundown of what we do at the Blender Studio.

[0:03:54] JN: Yeah, it raises a lot of questions. I guess, first one, before we get into the weeds on this, I'll give you a chance to talk about what you particularly do, but one question on that: you said you're always working on the latest main. I imagine with that goal of trying to make sure you're always testing the latest stuff, that puts - how many projects are you doing a year, basically? I guess they have to be quite short to keep that goal relevant, right?

[0:04:14] ST: Yeah, so it really varies. We've been doing some longer projects. I think on average, it's been about a year for one project, because we're a relatively small team, and we have long cycles of preparing the production, doing the production, and then we also do other stuff in between. We also create educational content and blog posts and stuff like that to actually share that knowledge. Because it's not just the content creation that takes up a lot of time; sharing that is also something that you need to actively put time into. It's been more like one project a year, and then some stuff in between.

More recently, we've been trying to speed that up and have a shorter iteration cycle to be able to also, like you said, try out different things and be very much on top of what Blender is doing, because if it's a longer cycle, like one year, it is difficult to make sure that it's aligned with the Blender development in a way that actually makes sense. Some smaller experimental stuff is a lot easier to handle in that sense. We've been trying to do four projects this year. That got cut down to three and a half, maybe, but we've already shipped one of them, which is DOGWALK. One of them is almost done, which is a character asset that people can use, a full production-ready rig that animators can use for their reels and stuff like that, and studios can learn from. Currently, we're working on another short film called Singularity, and upcoming is another short film that we already produced a little reel for, which is still in development.

[0:05:39] JN: Nice.

[0:05:40] ST: Yeah, we've been trying to speed it up quite a bit.

[0:05:42] JN: Yeah, but even going from one to four is a lot of projects. It's interesting. I was familiar with Blender Studio short films. It's really cool. DOGWALK was really cool to see, but also hearing about the character rig, that's really interesting. Well, I guess that's a good segue into your role. What is a technical artist? What is it you do?

[0:05:59] ST: Well, I think technical artist is a role that's not super defined. It really depends on the branch that you're working in. It's just that little spot in between being a programmer working on the software and an artist using it. It's a little bit of both. Mainly, I just try to find technical solutions to the creative problems that we're facing. What I do a lot is shading, like writing shaders for the assets that we're creating, and tools for the other artists on our team to work with. A lot of it is geometry nodes work nowadays. Yeah, with the Blender Studio being very tightly connected with Blender for the software development, I'm also - or not just me, but our whole team is also pretty involved with that.
In that sense, I'm quite involved with the geometry nodes development team as a stakeholder artist, basically, helping them design the features and test everything before it makes it into main.

[0:06:52] JN: Awesome. That makes sense. A lot of "you found a geometry nodes problem while using it for a project, now you have to help fix it" kind of situations.

[0:07:00] ST: Yeah. I mean, it comes with downsides and upsides, right? A lot of the time, it's very nice being able to face a problem in production and then talk to the people that work on the features directly and help come up with a solution that integrates well into Blender for everybody to use, but fixes our problem. That's been a large part of the philosophy of the Blender Studio being so connected to Blender development. If we fix a problem for ourselves, it's probably also going to help other people out. It's easier to make the software for us, because we know what we want, and it's probably still going to be useful for others, rather than only focusing on, okay, let's make the best possible software for everyone. Of course, we're trying to keep that in mind, but it makes it a lot easier to have a concrete example like the Blender Studio as a use case.

[0:07:46] JN: Right. With that in mind, what was the problem you were trying to fix with DOGWALK?

[0:07:51] ST: Yeah, so with DOGWALK, for us, it's been really a big change of pace being a game, since we're very much used to movie productions. It was mainly that we are very much aware that the game development community is using Blender quite a lot, maybe even more so than the movie industry, because there's already a disconnect between the DCC and the engine, where most people wouldn't use their actual game engine to create the assets. There's a difference anyway, so you might as well use Blender in your pipeline. It's a lot easier to interconnect than maybe in the movie industry, where people might already be using Maya for animation because it integrates well with Autodesk software, and then it's harder to shoehorn Blender in there.

We're very much aware of that being the case, and we don't have that much experience ourselves with it. Like we do for movie creation, we wanted to also have some more real-life experience of how the process works for game development with Blender, or generally with open source. It was definitely something that was interesting to us, to just get the experience and connect, basically, with the interoperability aspect of how we can make Blender easier to hook up for the exchange of assets, creating them and then using them in a game.

[0:09:13] JN: Cool. I guess, before I follow up on that, it'd be useful to talk about what DOGWALK is as a game, what the gameplay is, that kind of thing.

[0:09:22] ST: Yeah, so DOGWALK is a little micro game, we call it, because it's very small. You can play through it in 20 minutes. But it's a little interactive experience where you walk around in a winter landscape building a snowman. As a dog - you're actually playing the dog - being led by, or maybe leading, a little kid through that landscape, and then you need to decorate the snowman, find some items around the world, and you need to get along, so there's really a relationship aspect to this game as well.

[0:09:53] JN: Yeah, absolutely. The leading bit. The first time I realized I could hurt the kid by pulling too hard, I was like, "Oh, no."

[0:10:00] ST: Yeah.
No, that was definitely intentional, to create these moments where people, maybe accidentally at first, just find out how you really affect the emotions of the kid running around with your behavior. That's one of the main core aspects. Because in terms of game mechanics, there's not a lot. It's really just movement mechanics. You're just walking around and that's it. Then there are obstacles and different environments, and you need to lead the child around the landscape without making it cry so much that it doesn't want to move anymore.

[0:10:32] JN: Yeah, it's very charming. Like I said, the 20-minute length, super playable. I highly recommend you just go grab it from Steam and give it a go if you're listening to this. It's really fun. Also, I liked the way you worked in Blender's Dutch roots by calling the dog Chocomel. Very good. I appreciated that.

[0:10:45] ST: I appreciate you noticing.

[0:10:49] JN: That's the game, and you called it a micro game, but it obviously has a lot of love and attention put into the art and the art style, which makes sense, given the team. With that art style, I got the vibe that you were going for a crafted look - I'm sure I'm not the first to comment that it's like Yoshi's Island, looking like craft paper, kind of thing. What led to the decision on that art style? You said that you tend to experiment with different art styles to drive Blender development. Was that a factor there?

[0:11:13] ST: Yeah. Definitely in general. For this, it was actually a bit different, because the final implementation of the game obviously would not be in Blender; it was in Godot. It's not really about pushing the art style capabilities of Blender as much. It was more about the interoperability, really. On a technical level, the question of the art style was more, how can we get something that we can work on mainly in Blender and don't really need to focus too much on getting that same look in Godot, because we wanted to really focus on other aspects. We just wanted to focus on the pipeline for getting assets into Godot. Shaders, replicating a complicated shader setup, for example, that was a few steps too far for that goal, because it was really just a small project, so we couldn't really focus on everything.

Then the logical conclusion was to try and go for an art style that allows us to basically use the built-in interoperability features that we already have. We went with glTF as the exchange format. There are many standards just for PBR workflows, right? It's just physically-based rendering. You just have some color map, roughness map, normal maps, and such. Then it just works for anything that is supposed to look photorealistic. For anything else that's really stylized, you might need to do something very custom, and we didn't really want to do that. The idea was to focus on a PBR workflow, but still stylize it on top of that. Papercraft is a natural answer to that, because it is physically based. It's supposed to look realistic in the sense of the surface quality and the lighting, but at its core, it's still very stylized. It's purely art, basically. It's all crafted in a way. That was the logical conclusion on a technical level for us.

Also, our director on the game, Vivien Lulkowski, she always gets very excited when it comes to doing things at a practical table, not just in the digital space. It was also very fun to see these things come to life in real life and then recreate them in 3D.
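To make the PBR workflow Simon describes a bit more concrete, here is a minimal, hypothetical Blender Python sketch of a Principled BSDF material wired with base color, roughness, and normal maps - the kind of setup Blender's glTF exporter can carry over to glTF's metallic-roughness model without custom shader work. The material name and texture paths are placeholders, not assets from DOGWALK.

```python
# Hypothetical sketch: a Principled BSDF material set up for a plain PBR
# workflow (base color, roughness, normal). Texture file names are invented.
import bpy


def make_paper_material(name="paper_material", texture_dir="//textures/"):
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    bsdf = nodes["Principled BSDF"]

    # Base color texture -> Base Color input
    base = nodes.new("ShaderNodeTexImage")
    base.image = bpy.data.images.load(bpy.path.abspath(texture_dir + "paper_basecolor.png"))
    links.new(base.outputs["Color"], bsdf.inputs["Base Color"])

    # Roughness texture (non-color data) -> Roughness input
    rough = nodes.new("ShaderNodeTexImage")
    rough.image = bpy.data.images.load(bpy.path.abspath(texture_dir + "paper_roughness.png"))
    rough.image.colorspace_settings.name = "Non-Color"
    links.new(rough.outputs["Color"], bsdf.inputs["Roughness"])

    # Normal map goes through a Normal Map node before the Normal input
    normal_tex = nodes.new("ShaderNodeTexImage")
    normal_tex.image = bpy.data.images.load(bpy.path.abspath(texture_dir + "paper_normal.png"))
    normal_tex.image.colorspace_settings.name = "Non-Color"
    normal_map = nodes.new("ShaderNodeNormalMap")
    links.new(normal_tex.outputs["Color"], normal_map.inputs["Color"])
    links.new(normal_map.outputs["Normal"], bsdf.inputs["Normal"])

    return mat
```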
That workflow worked out perfectly, because it meant that with the creation of these assets, we could also focus mainly on the interoperability part. It was relatively straightforward to just build the assets themselves. It was really about trying to put the focus where we wanted to and then see what results from that.

[0:13:42] JN: Very interesting. Good incentive to finish the game as well. If you want to see any of those physical models, they are in the credits.

[0:13:47] ST: Yes.

[0:13:48] JN: Do roll through. In terms of the pipeline itself, the team has said some really interesting things in some of your dev logs about what you're trying to achieve with the pipeline. Some things that stood out were that you wanted it to be Blender-driven as much as possible and to do as little as possible in Godot, which is really interesting. The other one was about glTF and why you chose glTF. But let's start with the first one. Actually, I guess at a high level, what did you want this pipeline to look like? As an artist, what did you want the flow to be?

[0:14:14] ST: Basically, the idea was that, like you mentioned, we could do as much as possible in Blender, because like I mentioned before, we're very much coming from just using Blender for all of our content creation, and having people learn a new software would be - I mean, it would be possible, of course. But also, it's not really in the interest of the people that are supporting us. To some degree, of course, we want to also support the Godot development. That's not the point. But we wanted, with the funding that we get from people that want to support Blender, to keep as much of that in Blender as well, while we're still providing a use case for Godot and giving them feedback for their development. It's a little bit about the skill set of the people that come from a background of movie production, but also about just keeping our focus strictly on Blender.

The ideal pipeline for us was to basically use exactly what we've been doing so far for asset creation, which is really relying on the data structures that we have in Blender. Having everything set up with collections as the unit of an asset, and then objects inside there that contain all the data, the geometry, the different data layers, and then being able to nest them inside each other. A tree would have multiple branches, but then another tree could use those same branches, to basically make it easier for ourselves to create those assets and not duplicate the data all over the place. We'd be able to nest them inside each other. Each branch would be an asset, and those are used in all the trees, which are all individual assets. Those are finally used in the set. The set itself is also an asset. Nesting was very much at the core of what we were trying to go for, because that's something that we're used to from working in Blender.

Then the idea was that we really just have all the assets, the sets, the characters, everything set up in Blender like we normally do. Then, instead of linking everything into animation files, which is what we usually do as a direct reference, we export it to a glTF and then push it to Godot. On the Blender side, everything was very much the same as we usually have it, just with nested assets that are just instancing data around. Then the pipeline would take care of the rest by recreating all of that hierarchy on the Godot side.
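As an illustration of the nesting Simon describes - collections as asset units that instance each other, branches inside trees, trees inside a set - here is a small, hypothetical Blender Python sketch that walks those collection instances to list what a given asset depends on. It is not the DOGWALK pipeline code, just a sketch of the data structure being discussed.

```python
# Hypothetical sketch: collections act as asset units, and assets reuse each
# other through collection-instance objects. Walking those instances yields
# the dependency hierarchy an exporter would have to recreate on the Godot side.
import bpy


def asset_dependencies(collection, depth=0):
    """Recursively print the collections instanced inside an asset collection."""
    print("  " * depth + collection.name)
    for obj in collection.objects:
        # Collection-instance objects (empties instancing another collection)
        # are how one asset nests another without duplicating its data.
        if obj.instance_type == 'COLLECTION' and obj.instance_collection:
            asset_dependencies(obj.instance_collection, depth + 1)
    # Collections nested directly inside the asset count as well.
    for child in collection.children:
        asset_dependencies(child, depth + 1)


# Example usage with an invented collection name:
# asset_dependencies(bpy.data.collections["SET_forest"])
```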
The artists that were working on the actual content creation of the assets on the Blender end only really needed to touch Godot to validate that what they did actually looks the same. If the pipeline worked perfectly, they wouldn't really need to touch Godot at all, although obviously, it's very important for us to still take a look and validate everything. The people that were actually working with game code would mainly work in Godot. But the people creating the content would really be able to stick to just working in Blender.

[0:17:05] JN: Very cool. So, part of that workflow that I found interesting, as an amateur Godot dabbler who has been frustrated by collision meshes, is that you're also trying to deal with the collision meshes on the Blender side, right? Did that work out?

[0:17:16] ST: Yeah, I think that worked quite well. With the pipeline that we went for, we immediately saw that if we want to actually have this work, we need to extend the built-in functionality quite a bit with some custom things that we want to add. Once we had some setup in place to just have an extension - which is very easy to do in Blender for the glTF export; you just create an extension from the template that already exists - then you can add a lot of functionality as hooks to just run code whenever something in the exporter is happening. It was very easy for us to just write metadata to the individual export nodes about the collision, for example. It's very custom, of course. It's not really super easy to replicate that, unless you really just use the exact same methods.

For me, within that framework, it was very easy to create a geometry nodes node group, for example, that would create some of the primitives that we want to use for collisions - the capsule, sphere, whatever exists in Godot - and then have a user interface on the Blender side for people to define the properties of that, like the size, the radius, and everything of the primitives for the collision. Then, just in the name of the object, mark it as, this is supposed to be for collision. Then the script could very easily just read out all those properties, write them as metadata, attach them to the node on export, and all that would happen automatically. Then on import, we would just use all of that metadata to recreate the exact settings. We really had this mirrored setup on the Blender and Godot sides to give us the same functionality and to communicate to the artist working on it: basically, this is what you're going to get. Then we make sure with the pipeline that that's actually happening by recreating the whole setup using the metadata. It was really quite easy for us to set up a nice user interface on the Blender side and write that data in a way that it can be used on export.

[0:19:12] JN: That's awesome. That's very cool. Which, I guess, brings me to - the goal of this project was to experiment with this pipeline, see how the pipeline between these two pieces of software works, and drive the development of that. How much of what you ended up with was additional tooling on top of Blender and Godot, and how much is things that we might see appear in a later version of Blender, if that makes sense?

[0:19:32] ST: Yeah. I think most of the features that came out of this are already in the 4.5 release that's out. That was mainly bug fixes, some smaller features for the glTF exporter, some bigger ones also on the geometry nodes side, and some reworks of the API to make it easier to export geometry data. That's hopefully still coming.
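For readers curious what hooking into the glTF exporter looks like in practice, here is a hedged sketch based on Blender's documented glTF2ExportUserExtension mechanism, where a gather_node_hook runs for every exported node and can attach extra data. The "COL_" naming convention, the custom-property names, and the extension name below are invented stand-ins for illustration; the actual DOGWALK conventions live in the project repository Simon mentions.

```python
# Hedged sketch of a glTF export hook that writes collision metadata onto
# exported nodes. Names ("COL_" prefix, property keys, extension name) are
# illustrative only, not the DOGWALK pipeline's real conventions.


class glTF2ExportUserExtension:
    def __init__(self):
        # Import deferred until the glTF add-on's modules are loaded,
        # following the pattern from the add-on's example extension.
        from io_scene_gltf2.io.com.gltf2_io_extensions import Extension
        self.Extension = Extension

    def gather_node_hook(self, gltf2_object, blender_object, export_settings):
        # Only objects marked as collision proxies by name get the metadata.
        if not blender_object.name.startswith("COL_"):
            return
        if gltf2_object.extensions is None:
            gltf2_object.extensions = {}
        # Custom properties authored through a Blender-side UI describe the
        # primitive; a matching importer on the Godot side can read them back
        # to rebuild the corresponding collision shape.
        gltf2_object.extensions["EXT_studio_collision"] = self.Extension(
            name="EXT_studio_collision",
            extension={
                "shape": blender_object.get("collision_shape", "capsule"),
                "radius": blender_object.get("collision_radius", 0.5),
                "height": blender_object.get("collision_height", 2.0),
            },
            required=False,
        )
```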
That's still in the works. There was a big chunk that was also just custom code on top, which hopefully we can still use to communicate with the Godot developers to see, okay, what of that can we - how can we tackle those shortcomings, basically. To some degree, it's also that the Khronos Group, who develop the glTF exchange format, have some extensions to glTF that we haven't really used, because, to be honest, at the time when we were developing the pipeline, I wasn't fully aware of those, or some of them I saw, but they seemed very much work in progress, so I wasn't sure if you could rely on them. At that point, it was easier for us to just do a custom thing of our own that we know works, that we have full control over. But there are some extensions that they're working on. For example, for complex scenes, the whole nesting concept that I talked about earlier, which we did in a custom way - there's a concept for that with glTF as well, with an extension.

I think now it's a little bit of a matter of trying to assemble the pieces to really bring those features to the regular user that doesn't necessarily want to write their own custom pipeline, because that's something that we still need to do, but that's definitely planned. For now, because the project itself was a relatively small scope - we did the whole thing in four months - doing that with some development on the side, and then also polishing everything, is not really feasible. So for now, we just focused on getting all the pieces and then assembling them later. We're also planning to do a workshop with some Godot developers and some interested people to actually make that happen. That's going to drag a little bit more, unfortunately, as things go, because we also have some other projects that we're working on. But yeah, we definitely still want to make that happen.

[0:21:42] JN: That's awesome. Yeah. Although, I know in the post you've also disclaimed that it's not production-ready and that it's very beta. But in the post as well, you did release the source of the pipeline that you have currently. Again, with a lot of disclaimers, but if folks are interested, you can go poke around with that; look up the DOGWALK wrap-up post.

[0:21:57] ST: Yeah, we definitely have the entire repository with the project actually uploaded. The game itself, the code, including also the pipeline, is all online for people to dabble around with.

[0:22:06] JN: Perfect. One thing I think is really interesting - you just said that you were thinking about the DX of this from the perspective of an individual user of a program, but one of the special things about Blender Studio is you're doing this at a production scale. It's a large-ish team working together. That, I saw, had some really interesting implications for your selection of glTF as the interchange format over blend files. Can you talk to us a bit about that?

[0:22:29] ST: Oh, over blend files. Well, as far as I know, on the Godot side, from what I understood, the blend file itself directly as an exchange format is, under the hood, still using glTF for the exchange. In terms of the feature set, it would have been more or less the same, what's supported out of the box.
For additional customizability, it was a lot easier for us to make that step explicit, because if we just have the blend file itself as the thing that's imported for the game files, it would still generate that ephemeral glTF data under the hood, but we would have no control over it, basically. The way that we did it now is more or less the same, because it's still using glTF, but we actually have control over the data that we output. Hooking up to that export step, by having an explicit export step in the first place, was really valuable for us in that sense.

Something that we also generally have been working towards in terms of our workflow is to have an explicit publishing step. Rather than having all of the work-in-progress data that you might be working on be the input, having an explicit publishing step that the artist can control to say, "Okay, this is done. This can go into production," and then pressing the button - that's not something that we see as negative at all. It's actually something very positive to build into your workflow. Having the blend files as direct input to Godot wasn't actually very useful for us, because we would make a custom pipeline anyway. I think for people, individuals, for example, that are working on their own, it might still be very useful to have that, just because it cuts out one element of their pipeline that might not be necessary. For us, as a studio working on this, it meant giving us more control, which was actually very valuable.

[0:24:18] JN: Yeah, that's really interesting. I guess, while we're talking about working on things as a studio - you're used to working on shorts and motion pictures and stuff. How was it working on a game? Did that require differences in how you worked as a team, or in your workflows?

[0:24:32] ST: It was quite different. I think it very much depends on who you ask. For the whole team, it was quite different. But for some of the people working on it, in terms of pipeline, we tried to keep it as much the same as it was, right? Like I mentioned before, the people that were creating the assets were still working in Blender 99% of the time. In that sense, we tried to keep it basically the same as it was. Of course, animating for a game is going to be very different than animating for a film, because you cannot rely on the camera being in a specific spot, or on the angle that the character is facing. We also have a blog post about the differences there, but it gets very different animating for a game, also with a variable refresh rate, compared to a film where you always know, okay, it's 24 FPS. The camera is exactly this, the character is facing this way, and then you can cheat anything that you want in between. With a game, you can't. The player will see immediately when there's something cheated. In that sense, it was very different for the animators. For the asset creation, I think it was quite similar actually, because usually we want to be able to see assets from all angles anyway.

For me personally, it was very different, because I was in a completely different role than usual. I was mainly, or almost exclusively, doing programming on this project, whereas I usually only do a little bit on the side. For this, I was actually implementing the pipeline, implementing shaders, and stuff like that. Then at some point, halfway or so through the production, I could actually move over to help out with programming on the game code itself, which was quite late.
I would have hoped to move over to that a lot earlier, but the pipeline was proving to be more complicated to implement.

[0:26:12] JN: Had you used Godot before this project?

[0:26:13] ST: No. I think I'd opened it up before, but that was about it. I went to last year's Godot conference. At that point, I hadn't used it at all, but it was really interesting to hear about the way that they do open-source development and how that is different from what we do at Blender. It was just a super fun event. Really good talks. Very cool that you can try out games that people made in Godot right there at the trade show. That's basically been my experience with Godot before this, just hearing about it.

[0:26:43] JN: What were some of those ways the two projects are managed differently?

[0:26:46] ST: The way they do open-source development is maybe a bit more democratic, I would probably say, and more loose in terms of adding features, adding smaller features. The way I understand it is that they try to solve problems, if necessary, in several different ways, with a slightly different flavor, while we at Blender try to keep things a bit more streamlined in that sense. But that makes them a lot more agile, because you can just fix a problem on the fly, and then it's fixed, and there might be a different way to solve it slightly differently, but that's fine. I think you can argue about what's the better approach here and what's more scalable in the long run, but it was definitely very interesting to me to see how that works. I think most of their code is actually written by contributors and then reviewed by the people that are actually hired by the foundation. I think at Blender, it's the opposite way around, where most of the code is actually written by people that are on payroll, hired by the Blender Institute. Of course, there are still very important contributions from community members, but in terms of just the quantity, I think it's the other way around for us.

[0:27:57] JN: Right, right, right. So, I guess, to come back to your first use of Godot. I guess you answered this in the intro, but was there anything that drove the choice of Godot here, aside from it being the open-source engine of the moment?

[0:28:08] ST: Yeah, it was an obvious choice for us. Well, the director of the game, the main person driving this idea here, Julien Kaspar, he has been just very interested in games in general. Most of us have been, but he's been trying to play around with game development on the side. He doesn't really come from a strong programming background, but he really tried to lean in deeply to figure out how to make these things happen, because he was really excited about the project. He had been trying out Godot on the side, and he had been really excited about it. That's mainly where the choice came from, I think. But like you mentioned, it is the open-source game engine of the time, I suppose. Personally, I've also been very excited about it. Jumping on it, when I actually started using it, it felt very familiar. It felt very easy to get around. I was surprised how quickly that would work out, especially with Python, which is very familiar to me from Blender add-on development, being a basis for GDScript. That was very nice to jump onto.
Of course, there used to be a Blender game engine, and I think in the aftermath of releasing this game, there's been a lot of talk from people that Blender should bring it back.

[0:29:23] JN: There is a fork of that that exists and is ongoing, right?

[0:29:26] ST: Exactly. That was what I was going to bring up. There is a fork. But yeah, to be honest, we didn't really consider it that much as a viable option. We'd have to actually look into it a bit more again to see how they're doing with the development of that. In terms of what people are using, it's a lot more representative to use Godot, because, like I mentioned before, we were really trying to figure out the current landscape of open-source game development, to see what people are currently doing and where the shortcomings might be. Looking at - I think UPBGE is the name of the fork of the Blender game engine - that would not really have been super representative of that.

[0:30:05] JN: Yeah, and I also imagine that would probably fuel the discourse about whether Blender should do its own game engine again even more, if you did it in the fork.

[0:30:11] ST: Oh, definitely. Yeah.

[0:30:13] JN: We've been talking quite a bit about things that have been going on on the Blender side, but I guess, on the gameplay side, as you were getting into Godot for the first time, was there anything about Godot that surprised you, or features that were missing for what you were trying to do?

[0:30:28] ST: Personally, I'm also not coming from a game development background, just to put that out there as a disclaimer. I don't have a lot of experience, or presumptions, I guess, coming into this. I was a blank slate in that sense. I've been really surprised how extensible Godot is, also for tooling. That's actually something that I heard about a lot before, for example at the Godot conference when I was there: how writing tools for Godot for the actual development is the same thing as writing the game code itself, which is absolutely true. It's a brilliant concept, in my opinion. I've been trying to think about how we can replicate that kind of thing in Blender. To me, geometry nodes is a little bit similar, where you can use it directly for content creation itself and for tooling to create content with. It's a similar idea, which I really like as a point of connection. In Godot, it's been really useful for me that you can really just write features or tools, and it doesn't really matter - you can use the same code in either one. We've been exclusively using GDScript, just because that's something that we were a lot more familiar with, it being so closely related to Python.

[0:31:43] JN: I think that is definitely the happy path as well, as far as I understand it.

[0:31:46] ST: Yeah. Yeah, we didn't even think about using anything else, to be honest. From what I hear, it's also very extensible with C++, for example. It was really interesting to me how extensible it really is, especially with the signal functionality that you have, where you can really hook up functions that you write yourself to the engine itself, even to the point of the file system, for example. That was actually very useful for us. For me, writing the pipeline, I did need to use a lot of workarounds to implement it the way that I wanted to, by combining different functionalities in a way that I didn't expect I would have to. Godot made it possible by being so extensible.
I could have my own functions hooked up to signals of the file system, for example. Whenever Godot would recognize that there's a new file in the game files, I could attach my own code and do whatever I wanted to prepare the files for import, for example. That's something that we've been doing for this pipeline. If I compare that to the extensibility of Blender, in Blender I would have never been able to do this amount of customization. That's been really, really interesting for me.

[0:32:56] JN: I guess it's intuitive when you think about how it works. I didn't realize that the signal system extended to how the editor is working, which I guess makes sense, because the editor is written in Godot, but yeah, that's a really interesting concept to hear about. That's really cool.

[0:33:09] ST: Oh, exactly. It was also a moment where I didn't think about it at first. I was trying to solve a problem and I just googled, and then somebody suggested on some forum to use a signal. I was like, "Huh." That opened up a whole new world that I didn't know about before. Because by that time, I also hadn't done any of the game development itself yet. I was still trying to find my way around the API in Godot. It does work quite differently than what I'm used to in Blender. It is a lot more extensible in that sense. I was really impressed with that. That's also something that I'd like to potentially bring more to Blender, to be able to hook up to native functionality of the software with your own code in an add-on.

[0:33:46] JN: Cool. Well, that brings us close to the end of our time. I think it's a good place to wrap. Simon, thank you so much. This has been super interesting, and it was wonderful to hear how this was all made in this case. Thank you so much for joining us today.

[0:33:58] ST: Yeah, thank you.

[END]