EPISODE 1878 [INTRODUCTION] [0:00:01] ANNOUNCER: The Stack Overflow Developer Survey is an annual survey conducted by Stack Overflow that gathers comprehensive insights from developers around the world. It offers a valuable snapshot of the global developer community, covering a wide range of topics, such as preferred programming languages, tools, and technologies. Jody Bailey is the Chief Product & Technology Officer at Stack Overflow, and Erin Yepis is a Research Manager at Stack Overflow. They joined the show with Sean Falconer to talk about the results of the 2025 Developer Survey, which was recently released. This episode is hosted by Sean Falconer. Check the show notes for more information on Sean's work and where to find him. [INTERVIEW] [0:00:56] SF: Erin and Jody, welcome to the show. [0:00:59] JB: Thank you. Great to be here, Sean. [0:01:01] SF: Yeah, and Erin, I guess, I should say, welcome back. This is now our annual podcasting event, where we discuss the Stack Overflow Developer Survey. I'm excited about that. [0:01:11] EY: Good. Yeah, let's go ahead and get this set up for 2026. Getting on the books early. [0:01:16] SF: Exactly. I think we've got a lot to cover. Before we get into it, I wanted you to help frame what the Stack Overflow Developer Survey is for those that are not familiar with it. Can you give a quick explanation of what it is and what you're trying to accomplish with it? [0:01:33] EY: Yeah. The Stack Overflow Developer Survey has existed for almost as long as Stack Overflow, the site, has existed. This has been our 15th year of running it. It started as a very small thing. There were a lot of people on the site asking questions, and we wanted that really clear, aggregated view of what technologies are you using? We can see what you're asking about, but it was just the most obvious question to ask of that community. Since then, it's grown into quite a bigger deal than that. 
It's just a unique opportunity every year to get that temperature check from actual developers, from them directly saying it themselves, instead of a third-party inferring what it is that developers use and care about and do, and it means a lot to our community. [0:02:23] SF: What's the typical number of people who participate, and the makeup of those developer profiles? [0:02:29] EY: For the most part, the core group is - one of the questions that we ask is, do you code for work? The majority of our users every year indicate that they are a developer by profession, so they're working as developers. This year, it was somewhere around 76% that indicated that. But the survey is open to people that are students as well, or people that are adjacent in a way, like they're PMs, project managers, and stuff like that. Besides that, we have a huge audience, obviously, here in the US. Of all the countries that responded this year, last year, and three years ago, the US, Germany, India, and the UK are always in the top five. This year, about, yeah, 50% of all respondents came from those four countries in addition to France and Canada. We have a strong European, North American, and Indian developer group. As we saw this year, and what we have been seeing trending lately, too, is that we have a lot of more experienced developers. We ask this every year: how many years have you been coding? In 2025, 65% of the respondents indicated they've been coding for more than 10 years. [0:03:40] SF: Jody, did you have something you wanted to add about your perspective on why you do this and the value that it brings to the community? [0:03:47] JB: Yeah. I bring a slightly different perspective, because I was one of the people that was always looking at the dev survey as a dev manager, as a developer, even, understanding what other people are doing, what are the trends, what can I learn? As a hiring manager, it was always important to understand what tools and technologies people were using. It was always a barometer. 
It was something I always looked forward to, to see what was new and exciting, what people were thinking, and it was super helpful just in terms of identifying trends, as well as thinking about how I hire and create a place that people wanted to be a part of. [0:04:25] SF: Yeah. I was saying in the pre-show that I reference the survey all the time. I think that if you really want to sound smart in conversations, there's a lot of meat on the bone that you can leverage there for all kinds of different questions. People might ask, like, how's Rust doing as a language? It's right there in the survey, and you can sound smart quoting some statistics around that. [0:04:45] JB: Yeah. That's one of the things. A lot of the places I've worked, we've been polyglot, where there's lots of different tools and technologies, right? Everybody wants to use something new and cool. Rust, right? The survey would be a good reference of, okay, well, if we're going to consider this, what is our hiring pool, right? I mean, do we really want to have a tool where maybe we can't find enough developers to work in it, and end up with this code base in Rust - not to pick on Rust, fill in the blank language - where after that advocate leaves, then what do we do? The developer survey was super helpful in that way, too. [0:05:20] SF: Yeah. Is this just a moment-in-time trend? Should we invest in this in the company? How does that impact us? Or is this something where everybody's moving this way? I think the survey results certainly can help with those kinds of decisions within a company. From your perspective, and I'd love to hear from both of you if you have different views on this, what is the biggest story coming out of this year's developer survey? Maybe we'll go back to you, Erin. [0:05:43] EY: Yeah. I think the biggest story, and I would not be alone in thinking this, is the numbers we have around adoption, trust, and favorability. 
For the third year, we see more developers than ever are using AI tools. I should have mentioned that - trust of AI tools, not just trust in general. [0:06:06] JB: I knew what you were talking about. [0:06:08] EY: Yeah. I reference this so much, I'm like, you guys know what I'm talking about. Because it was a pretty big story. We see that for the third year in a row, we have this growing adoption of AI tools. Now, over 80% of our respondents, 84% I believe, are using AI tools for work. But trust took a dramatic dive this year, compared to 2023 and 2024. Favorability did as well. It's interesting to see that: wow, so more people are using these tools, but they don't like them and they trust them less. To me, I think it makes sense. One, there's just more AI tools being developed. You have a lot of things that you can try, a lot of things that could fit into the nuances of what your role is as a developer. Not just a one-size-fits-all type of deal. I think we have these high expectations as well for certain developers. Of course, everyone knows when you have high expectations, there's more probability that you're going to be disappointed. I will say this: we asked specifically this year whether you use AI tools daily, weekly, or less than weekly. What's the variability in use there? There is a correlation that we see for the developers that indicated they use them daily. They have higher favorability scores than those that use AI tools less than daily. I think it probably speaks to more of that learning curve, like, "Hey, I just tried a couple of things. Some of it didn't meet my expectations. But if I keep up with it and find out how to make this work, then I can figure out my expectations are at the right place. I'm using the tool for the right thing that works for me." 
We also see with younger developers, developers that have less career experience, they have higher favorability and trust scores as well, which probably also just speaks to, "Hey, my bar was low already. I don't know what I don't know, because I just started doing this job." There's room to grow there as far as those two things. As I mentioned, we have a lot of very experienced developers that answer the developer survey. Of course, their expectations are going to be high. [0:08:14] SF: Yeah. The key is set low expectations and then you'll be blown away. [0:08:20] JB: It's like, when you go see a movie and everybody told you it was amazing, and you're like - [0:08:24] SF: Yeah, exactly. [0:08:25] JB: It's okay. [0:08:26] SF: Yeah. Nothing disappoints you like high expectations. I would think, too, as part of this, we're becoming more comfortable with AI. It feels probably less magical, because people are using it day-to-day, so you're less blown away by it. Then also, I think if you're using it all the time, you naturally encounter the rough edges more often. The other thing I wonder, which could be a factor in terms of the decrease in trust and people enjoying these tools, is that maybe in the early days, essentially, the collection of developers that were using them was more of the early adopters that were excited about this. Now a lot of companies are taking the stance that, hey, if you're going to develop in a company, you have to use this tool. We purchased it. We want the efficiency gains. Now, whether you want it or not, you're being forced to use it, which naturally also creates a negative reaction to it initially. [0:09:17] JB: Yeah. I think that's spot on. It's something I've seen not just survey wise, but just talking to peers and other people in the industry. When people, as you mentioned, are forced to use it, so to speak, especially a tool like this, where my observation and experience is, it's not like you're perfect right out of the gate. 
I mean, you have to learn how to use the AI tools and how to break things down into manageable pieces. People are still experimenting as they're building things that - I talk to developers and it's like, well, I've got to figure out ahead of time, is it going to be more work debugging the code I generate, or just easier to write it myself, right? It's figuring out what that balance is. Going back to the high expectations and the dictation, right? Nobody likes to be told what to do, or how to do their job. Especially developers, right? I mean, we've always considered ourselves craftspeople, where it's as much art as science. Then to have somebody tell you, "Oh, well. Just generate a bunch of code and you'll be fine." That discounts the value of what developers bring to their job. There's also these super high expectations that all of a sudden, we're going to have this massive increase in throughput. If you think about it, and I always encountered this with pair programming, or mob programming for my team, it's like, there's this assumption, especially by leadership, or people that aren't in the business, that developers spend eight hours a day typing code, right? We all know that's not the case, right? If you think about the fact that developers spend 20% to 40% of their time writing code, even if you improve the throughput there, like cut it in half, it's still a small percentage increase in the throughput. That said, I think that's where there's real opportunity in terms of how to leverage AI. What I'm seeing with some of the thought leaders within my own team is identifying ways to reduce time on those things that aren't about writing code, right? The things that, whether it's collapsing meetings, or if it's about figuring out how to think about designs and break things down more effectively, you see, oftentimes, those types of things using AI are really beneficial. 
What I'm trying to do and encourage with my team is experiment with the different tools, find the ways that work best for you and help you, and not necessarily, hey, all code has to be written with an agent. [0:11:50] SF: Yeah. No, that's a really good point about the fact that engineers' time is not hands-on keyboard writing code all the time. You're only going to get so much efficiency gain there, even if the tool is as successful as possible. I think one area, too, where I'm seeing a lot of investment from companies - I can't remember if there was anything covered in the survey, and I'm curious to hear your thoughts on it - is the bane of many teams' existence, on-call duty. There's a lot of companies and investment on the AI front there around how do we alleviate some of the stress of being on-call and provide better tooling to solve those problems. A lot of times, too, if you're an on-call engineer, you might be on call for systems that you're not necessarily working in all the time. You end up inevitably having to go to someone who's an expert in the company to ask them questions, and then you're taking them away from their time that they can be thinking about designs, or executing whatever their job is. It ends up being a cascading effect of sucking up a lot of people's time - not just the person on-call, but all these other people that might be involved in it. [0:12:52] EY: Yeah. We didn't have a specific question about that use case, which I do think is interesting, yeah, that support aspect of things. I will say that we did ask about what are specific tasks, what are you using AI tools for? We created this visualization, where we superimposed the tasks that people said they're mostly using AI tools for on the satisfaction that they have with AI tools. We can see that for the most part, the people that use AI tools a lot are using it for searching for answers, which I think would be, when you can't find the experts, or the expert is not available, that's the use case. 
If it's helping them, they are more satisfied with that use case, too, of all of the things that you can be doing with AI tools right now. [0:13:42] JB: Not necessarily related to the survey, but just my own experience and working with the team is, using AI to explain code and break things down has been really useful. You can imagine in that use case. I see my quality engineers doing this to reverse engineer code, or as we're looking to break down really complex pieces of code, using AI in that way is really helpful. You can picture that as well from a support perspective, because you're right, it's like, you have one person and you can't expect them to know the whole system. How can you help break it down, make it easier to understand? I think there's real opportunity there. We see our infrastructure teams focused on improving those aspects. [0:14:21] SF: That applies to new people on the team as well, or even junior developers, too, just helping them grasp where things are, or how the code's structured. Having a psychologically safe zone to ask an AI might be preferable to having to ask somebody who's on the team. In terms of AI agents, there's a tremendous amount of hype right now around AI agents. There's a lot of stats out there that indicate this could really be the next frontier of software, but the survey shows that adoption is still relatively low. What do you think is holding developers back there? [0:14:59] EY: Yeah. We had a couple of questions where we asked about it in your own words, and for AI agents, we had this too. We asked about, what is the impact for those that have adopted it? Backwards-engineering that, what we see for those that have adopted it is that they were able to see productivity gains. They were able to learn new things faster and upskill, reskill. I think where the adoption is lacking is that whoever is in charge of either purchasing, or making the decision, they're not able to connect that this is a possibility. 
What does it mean in time and money? What's the ROI on that? There's people that have already taken the leap and they're able to show like, "Hey, my experience as a developer, this has helped me do this and this. It has saved me time. It has helped me learn." I guess, because a lot of the feedback we see about AI is generally surface level, it's hard to connect that to the specific agents. Should I invest in that? Is it worth the time of standing it up, debugging a couple of things with integrations, getting it running on a couple of local machines, having the team all in on it, all on the same page? Yeah, it's going to take some time to invest. We're seeing more of the good parts. We see that in the dev survey, too: they have said, "I use this," and for the people that have used it, yeah, they have seen these impacts. [0:16:32] JB: Yeah. One of the challenges, and it goes back to the hype, right, is the hype is way out in front of the actual engineering and the learning curve. It's easy to forget, or maybe not notice, how fast things are changing, right? Especially as engineers, we're still just learning how everything works. I mean, MCP as a protocol is really new, but it's really powerful, and it's changing the way people think. Building agents is - [0:16:59] SF: About a year old. [0:17:00] JB: Right? But it feels old already, right? It reminds me of, and you probably remember when JavaScript was taking off and felt like there was a new framework every week, right? It's like, you didn't know what to do, or how to do it. It's the same thing. You've got the hype and you've got all the people out on social media talking about, "Oh, it's doing this. I did this in five minutes," and they can do all that. Meanwhile, all the engineers are trying to deliver all the things they are already committed to, trying to learn new tools, what they can really do versus what people say they can do. Then how do you scale it, etc.? 
I'm seeing already, people are starting to adopt and build agents that didn't know what an agent was six months ago. [0:17:41] SF: Yeah. I think two years ago, no one knew what a vector was. Then suddenly, everything is - [0:17:47] JB: Right? It was a line with two points, right? From math and physics, right? [0:17:52] SF: Another trend that I think the survey talks about, that also is something that we see a lot on social media and so forth besides AI agents, is this concept of vibe coding. First of all, for those unfamiliar, can you explain what vibe coding is? Why do you feel it's resonating with some of the new developers that are out there? [0:18:12] EY: Yeah. Vibe coding is just a terminology, and it's categorized in Wikipedia. I feel like it has some sentiment associated with it. I wanted to make sure whenever I asked the question, I'm referring to this thing. This is a social phenomenon. I'm not trying to throw any shade on your work, or what you do, for whatever, however you get your job done. It comes with a lot of, I feel like, very passionate emotions for some developers. Vibe coding is just, you're using these AI tools to tell you what to do when you may have zero to little knowledge about what it is that you're trying to accomplish. There's a lot of people that don't even have to touch a keyboard in order to create a whole app. They can just use speech-to-text and then it does everything else in the background. There's levels to it. At its core, it's just you're using these tools where your knowledge is not quite there. You're using it to bridge the gap in order to get something accomplished software code-wise. [0:19:18] SF: Yeah. I think the survey said 77% of developers say vibe coding isn't really part of professional work. Do you think that this is primarily about enabling a new class of user to create things, or focus on certain types of side projects, versus something that we'll see in a professional setting? [0:19:35] EY: Yeah, definitely. 
I think that for right now though, the phrase itself isn't helping. It just conjures up an immediate reaction. We saw that in their own words, because this question on the survey was an open response. I let them say whatever they wanted to. Then I used an NLP algorithm to group it all together. I have responses that were emphatic: "No, I am not vibe coding." [0:20:03] SF: Yeah. Did you get any sentiment analysis on that? I'm sure there's wild things. [0:20:08] EY: There's lots of good stuff there. I think, so another indication, and Jody already mentioned this earlier, too. If vibe coding is associated, I think, with this future tech debt, your co-workers, your peers don't want to hear about you pushing this vibe coding idea, because to them, they're like, "That's just something I have to fix later." We know that, yeah, 45% of developers indicated that debugging AI-generated code was their top frustration with using AI tools. [0:20:39] SF: Outside of AI, using the survey, are there any technologies, frameworks or shifts that stood out in the data that people are excited about? [0:20:50] EY: Some of the things that I've noticed, as far as what's been growing in usage and popularity for technologies - I mean, it's not new, but I created this new section, so we grouped all of the cloud DevOps tools, technologies, and platforms together. A lot of these tools, especially when you look at some of the things that are being used for AI as well, it's old stuff that's being repurposed. It's like, oh, yeah. This is perfect for this use case, too. We are seeing this growing usage of cloud technology. We saw a huge jump in people using Docker and Kubernetes, which again, these have been around for a while. They've been on the survey for a while, but I just think we're now getting to that point where, for everyone, it's just becoming more standardized. 
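Erin describes running the open-ended vibe coding responses through an NLP algorithm to group them. As a rough illustration of that kind of grouping - not Stack Overflow's actual pipeline; the stopword list, similarity threshold, and sample responses below are all invented - a simple bag-of-words similarity pass over free-text answers might look like:

```python
# Sketch: greedy clustering of open-ended survey responses by
# bag-of-words cosine similarity. Illustrative only; real analyses
# would likely use embeddings or TF-IDF plus a proper clustering step.
from collections import Counter
import math
import re

STOPWORDS = {"i", "am", "is", "the", "a", "an", "to", "of", "and",
             "it", "not", "no", "at", "me", "for"}

def tokenize(text):
    """Lowercase, split into words, and drop common filler words."""
    return [w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOPWORDS]

def cosine(a, b):
    """Cosine similarity between two word-count vectors (Counters)."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def group_responses(responses, threshold=0.3):
    """Each response joins the first existing group whose running
    centroid it resembles; otherwise it starts a new group."""
    groups = []  # list of (centroid Counter, [member responses])
    for r in responses:
        v = Counter(tokenize(r))
        for centroid, members in groups:
            if cosine(v, centroid) >= threshold:
                centroid.update(v)
                members.append(r)
                break
        else:
            groups.append((Counter(v), [r]))
    return [members for _, members in groups]

# Toy demo: emphatic "not vibe coding" replies cluster separately
# from comments about Copilot.
demo = group_responses([
    "No, I am not vibe coding",
    "Absolutely not vibe coding at work",
    "Copilot helps me explain legacy code",
    "Copilot is great for explaining code",
])
print(demo)
```

This greedy single-pass approach is order-sensitive and crude, but it captures the basic idea of collapsing thousands of free-text answers into a handful of recurring themes before reading them.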
PostgreSQL, again, an old favorite, but it's coming back into the spotlight because of AI. It's just being repackaged, or re-marketed in a way, for that. We saw a huge jump in usage for that this year, too. [0:21:52] SF: There was also, in the last year, really that sentiment around the PostgreSQL-for-everything movement. We don't need all these other databases, just use PostgreSQL to solve all these problems. I wonder if that impacted the responses that you saw on the survey? [0:22:05] EY: Yeah, probably. Again, because we have a majority of more experienced developers. They've been around, they've seen like, hey, if I made it this far in my career, probably this is a good thing to keep using, versus moving to something that's just brand new. Yeah, other than that, it was also interesting to see. Stack Overflow itself, we have a new section where we asked, of these new popular tags, which are you interested in as far as the technology goes? While people associate Stack Overflow with the top three - HTML questions, JavaScript questions, Python questions - we see a growing interest and a lot of popularity with people that are using these technologies for those AI tools. Large language models as a tag is growing in usage. MCP is growing in usage. We also see RAG as a tag growing in usage. That was pretty cool. It shows, again, it's that temperature check on the developer community that we have access to. [0:23:08] SF: Were there surprises in the survey, things that came up that are different than prior years that you weren't expecting? [0:23:18] EY: Yes. This year, I added a question at the very end. If you made it all the way to the end, which a lot of people did, I just asked for general survey feedback like, "Hey, was this too long for you?" Then two, an open-response, general feedback question. I got a lot of feedback. It was pretty cool to see that they took the time. Again, they took the whole survey and then were able to leave feedback. 
But a lot of that feedback was general annoyance at having to answer so many questions about AI. [0:23:51] SF: Yeah. I think that's some of the things that we touched on earlier. I think it's natural, when there's such a saturation of something in a market, for people to start to become negative towards it. It's like overexposure even in media: okay, we have no game shows, and then we have too many game shows. Or, we have no reality TV, and then we have too much reality TV, and then people become turned off. We're probably going through the downturn in, I don't know, comic book movies right now. It's like, okay, well, we got oversaturated, now we need to contract the market, essentially. Maybe we're seeing some of that with AI as well. What do you think engineering leaders should take away from some of the findings around team productivity, training, tool adoption? How should people be thinking about that? [0:24:35] JB: Similar to things I've been saying, in terms of adoption, engineering leaders need to think about, how do they get their engineering teams to leverage AI to create the outcomes that they want, right? Which isn't necessarily to write more code faster. It's to deliver business outcomes, to provide solutions. I think that it's important that they give the developers the opportunity to experiment with different tools in order to find what works best for them. I think the other thing that is clear is that as people are using AI, they need help, and expert help, right? There's a certain point where there's a lack of confidence in the output. One of the things that came up in talking with peers as well: "When I get stuck, I just talk to somebody in my network." Well, not everybody has a large network, and they don't necessarily have a large community to talk to. That's one of the things that we want to support from Stack Overflow, is to be that place where people can go and get help when they get stuck. 
I think making sure that the engineers have different tools to choose from, look at different ways to leverage them to their advantage, take advantage of their own community, but then also their organizations. Then, also, look to places like Stack Overflow to share and learn, because we're all still figuring out how to use it to be really effective. [0:26:00] SF: It's like developing any new skill set. I think the more exposure you have to these tools, the better you get at aiming them to generate the outputs that you want and the value that you want. That can be hard for a company that's investing in these technologies, expecting some ROI delivery immediately and not factoring in the training aspect of it. Then, it's also something that's so net new for every company that you can't find somebody with 10 years' experience doing this that you can rely on. Where do you go? Where is your source of information? It's either trial and error, which might take time, or you have to go somewhere else. Historically, when it comes to learning new programming languages, new frameworks, it's really the community that's helped people facilitate that process. It seems like there's a gap in the market when it comes to these AI-powered developer tools. [0:26:52] JB: You need the collective power to figure out how to use the different tools. Some of the things we're trying to figure out, I'm trying to figure out, is when you think about implementing AI in your products, a big part of how you implement is creating the appropriate prompts. Who's an expert at writing prompts? There really aren't any. Is it the product manager, because they understand the product and the spec? Is it the MLE, or data scientist, because they understand how it works? Is it the software engineer? 
What I'm saying is it takes a village to really figure those things out, because there are no experts out there, or at least none with tons of experience, because people have been at it less than two years, even if they were there at the beginning. [0:27:39] SF: I think another aspect of this, too, whether you're building AI systems, or you're adopting these types of software tools, is how are you measuring results? What are you seeing from companies with that aspect? I go and I bring in some AI agent coding tool. Are companies thinking about how they're actually going to measure the results of whether this has an impact or not? [0:28:01] JB: Yeah. We see a variety of different things and no real standards. A lot of it is just the easy stuff: how much code is generated by AI agents or tools? It's not always easy, but some tools - Copilot, for example - will help you understand how much code. You can look at, what is your cycle time? How long is it taking from the time you do a pull request to the time you deliver the code? Are those periods of time shortening? At the end of the day, what we all care about as leaders is delivering results, and are we getting the results to users faster? I think some of the tried and true metrics still apply, whether it's DORA or Lean metrics, in terms of understanding how long does it take to deliver value to the user? At the end of the day, that's really what matters, not how much code we generated by AI, or how fast the code was written. It's how quickly are we delivering results? [0:28:58] SF: Yeah. Yeah, exactly. Lines of code is probably not a good measurement, even inside of it. [0:29:04] JB: Oh, no. I got to change everything. [0:29:07] SF: You touched on this a little bit, but how do you see the role of communities like Stack Overflow evolving in AI, especially where it's debatable how trustworthy sometimes AI answers are. How are you thinking about that? [0:29:21] JB: Yeah. I've spent a lot of time thinking about that, obviously. 
I think it's different but the same, right? What I mean by that, as we've been talking about, people are still figuring it out. Being able to take advantage of a large community, or population of people, is really still our best way of getting to grounded truth. I think what we need to do is find ways to create opportunities to ask questions about some of these things that people are encountering - making it easier, for example, to take a prompt and put it on a site like Stack Overflow, create a question, and be able to get an answer. That means we've got to adapt how we think about questions and answers on our site. At the same time, one of the things that we're seeing, just somewhat independent of AI, but in part because of it, is people are looking for quicker answers. They're looking for smaller communities even, right? When you have a million people that you're interacting with, that doesn't always feel like a community, and we're seeing a lot of the newer developers and the non-traditional developers, right? We haven't really talked a lot about that, but more and more people are writing code that aren't traditional developers. How do they find people that they can talk to, that they can get feedback from? You see a lot of micro communities out there, whether it's Discord or other things. With Stack Overflow, we're looking at ways to create easier ways for people to connect, identify their community, their tribe, if you will, be able to ask questions and get answers. While it's evolving and different, it's the same. [0:31:03] SF: Yeah. I think what you said about the niche communities - that's a trend that we see not just with developers but across all media, basically: there's something for everybody, regardless of what your interest is, and you can find your tribe out there on the Internet now and connect with them. People are able to go a lot deeper into their interests. It doesn't necessarily have to be mainstream interests. 
It's not the days when we had four channels on television and two radio stations. You're bombarded with a plethora of things that you can distract your attention with. [0:31:34] JB: Right. Yeah, you're dating us both now. [0:31:37] SF: Yeah. If we look ahead to, say, the 2030 survey of this, so five years from now, do you think developers are still going to be wrestling with some of these concerns over trust for AI? Or do you think that's something that we'll solve between now and then? [0:31:54] EY: I think, when you talk about trust, there's a couple of different things going on. I think in five years, we will get better tools that don't hallucinate as much as they do now. Probably, yeah. Trust is a human emotion. Trust is something that humans have evolved to learn and discern. When you talk to someone, you have to immediately make a value judgment: do I trust what this person is saying to me? We asked specifically in this survey about why would a developer continue to ask a human being questions if AI can answer all your questions? A lot of the open responses for that indicated that AI is no replacement for a human being. There is no sense of trust with AI. AI cannot learn trust. I want to trust something, or someone, that learns trust itself, so it's reciprocal trust. I don't know that we're ever going to see that trust, that specific level of trust, increase all that much. I think the tools are going to get better for sure. In the end, again, going back to this idea of communities and niche communities, and of course, people not spending eight hours a day at work just coding, straight up coding. There's more to the experience, and to the creativity and trust and all of those human aspects of building, that is not replaceable. People do not seek to replace it. [0:33:28] JB: I agree. The lack of trust - let's see if I can say this right - will approach zero, but it won't get there. 2030 isn't that far away, I guess. 
It sounds so far away, but yeah, it's five years, right? I don't think that we'll get to a point where people trust AI 100% in that timeframe. They'll trust it more. They'll still want to validate things. They'll need the human connection. Even thinking about cloud and how old that is, you still have people that aren't comfortable going to the cloud, right? [0:34:02] SF: Yeah. Security, privacy concerns with the cloud. Exactly. [0:34:04] JB: Yeah, exactly. AI is so much more of a mystery than cloud, right? I think that it's going to be a long time before we get that far in terms of trust. [0:34:16] SF: I also think AI has the challenge, and I think robotics is similar, where it is doing something that historically feels uniquely human. Whereas cloud infrastructure is not like that. It takes more of a mental leap to anthropomorphize it. Whereas, if I can chat with a system that sounds human enough, then that evokes a different emotion, I think, in people than an infrastructure system does. [0:34:43] JB: Yeah. Yeah. We tried to use pets and cattle with cloud, right? But it's not the same. [0:34:50] SF: Yeah. As we get close to wrapping up here, is there anything from the survey, or other insights, that stands out as worth having a conversation around? Anything that we missed? [0:35:04] EY: There's so much. It's hard to focus and home in on just one set of interesting insights. I will say this. Every year, we try to get better at getting the most answers for specific questions. Over the years, we've gotten a lot of great feedback, particularly about the job satisfaction question in the survey. Specifically, the job market for software engineers hasn't been great in the last couple of years, and they have a place to go talk specifically about that there. I did a little analysis, since we asked about job satisfaction last year and this year. We saw a little bit of an increase over last year.
One in five developers said they were happy with their job last year; this year, it's one in four, which probably has a lot to do with the job market. When you look at that, I ran an analysis, just a simple regression of job satisfaction against how many tools people use. We asked specifically this year, how many tools are you using at work? You could enter a number. We compared it to job satisfaction, because we know developers are using a lot of tools, and now there's a whole bunch of AI tools out there, too. We see that for people that are satisfied with their jobs, the number of tools they're using for work doesn't affect that. I think that was interesting, mostly because while it must be annoying for a lot of people to have to learn how to use a new tool if it's mandated, like you were talking about earlier, "We have to use this," I think developers like using a lot of tools. I think there are things where they're like, "No, I like having this specifically to do this." It's not the number of tools that's really dictating whether they're happy at work, which is good, because I can only see that tool number increasing when we talk about this, yeah, by 2030. I think that it probably will go up a little bit, but it won't be for - I think we'll adapt to that, too. Things are going to be more integrated, hopefully, and just easier. Everything, you log in once and it works and you've got your 2FA. It's already done and stuff like that. I thought that was pretty cool. [0:37:14] SF: Jody, do you have anything to add? [0:37:16] JB: I mean, I think the one that always stands out for me is the happiness at work. It hasn't changed a ton, but it's gone up, which is good, but it surprises me, right? My expectation, or maybe my hope, is that more developers are happy at work. Especially in the early days, and I'm dating myself here, people became developers because they were passionate about what they do, passionate about writing code, and they loved doing that.
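[Editor's note: for readers curious what the kind of analysis Erin describes might look like, here is a minimal sketch of a simple linear regression of satisfaction against tool count. The numbers below are invented for illustration only; they are not survey data, and this is not Stack Overflow's actual methodology.]

```python
# Hypothetical illustration of regressing job satisfaction on the
# number of tools a developer uses. All data here is made up.
import numpy as np

# Invented respondents: tools used at work, satisfaction on a 1-10 scale.
tools = np.array([3, 5, 8, 12, 4, 9, 15, 6, 10, 7])
satisfaction = np.array([7, 6, 8, 7, 5, 8, 7, 6, 7, 8])

# Ordinary least squares fit of a line: satisfaction ~ slope * tools + intercept.
slope, intercept = np.polyfit(tools, satisfaction, 1)

# A slope near zero would mirror the finding Erin describes:
# tool count doesn't meaningfully predict satisfaction.
print(f"slope={slope:.3f}, intercept={intercept:.3f}")
```

In practice a survey analysis would also report significance and control for other variables, but a near-zero slope on a scatter like this is the basic shape of "number of tools doesn't affect satisfaction."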
My hope is that they continue to love the craft, the art of creating new solutions via code, or vibe coding, or whatever the right phrase is. It comes back, I think, a lot to people feeling they have the autonomy to choose what they do and how they do it, and what tools they use, in order to create those outcomes. I suspect that the more we try to dictate, "You have to do it this way," the more challenges it's going to create in terms of happiness in the workforce. Hopefully, with AI tools, engineers will find ways to adopt them, to automate and get rid of the parts of the job that they don't enjoy, and focus on the things that they really do enjoy, which is creating solutions for people. [0:38:32] SF: Yeah, I think that's the idea. The goal would be, can you give people back time to spend on the things that they're more interested in, that make them happier and more satisfied, whether that's solving more complex problems or whatever it happens to be, versus the more mundane types of things they need to do. That's not just about developers; that's across the whole workforce. That's the ideal state. I don't know that we're there yet, but hopefully, we can get there. [0:38:59] JB: Agreed. [0:39:00] SF: Jody, Erin, this was fantastic. Thanks so much for doing this. Thanks for coming on the show, and cheers. [0:39:05] JB: Thanks. Take care. [END]