You've seen headline after headline. Clickbait YouTube video followed by clickbait article followed by... It's exhausting.
We're being force-fed this story that AI is the end of software developers -- and unfortunately, we're perpetuating it.
Listen to what Sundar Pichai says to Lex Fridman in this interview on AI and programming.
Transcript:
All right, with all of the clickbait titles that exist on the internet, across YouTube videos and articles and wherever else, we finally have something with some positive news for software developers and AI. In this video, I'm going to be reacting to and reviewing the interview clip with Sundar Pichai and Lex Fridman, where they talk about AI tooling and its effects on software engineers. In every other video or news article we see, it's always doom and gloom: software engineers are about to go extinct, it's the end of the line. But in this clip we get a glimpse of Sundar Pichai's perspective on this stuff, especially when it comes to Google and the way forward, and in my opinion, this is totally awesome. So if that sounds interesting, just a reminder: subscribe to the channel and check out that pinned comment for my courses on Dometrain. Let's jump over to the video and check it out.

Lex opens: "I have to ask you on the programming front. AI is getting really good at programming. Gemini, both the agentic version and just the LLM, has been incredible. So a lot of programmers are really worried that they will lose their jobs. How worried should they be, and how should they adjust so they can be thriving in this new world?"

Just to call it out, I like that his question has two parts. He acknowledges that we have a lot of programmers and developers who are fearful for their careers, but he doesn't just leave it there. It's also: how should they be thinking about trying to thrive in this world? So
let's see what Sundar has to say: "Where more and more code is written by AI, I think a few things. Looking at Google, we've given various stats around, like, 30% of code now uses AI-generated suggestions, or whatever it is. But the most important metric, and we carefully measure it, is how much our engineering velocity has increased as a company due..."

You've probably seen this come up in different news articles, whether it's from Microsoft, where I'm working, from Google, from Facebook/Meta; it doesn't matter where. The metric we seem to get all of the time is the lines of code created by AI. Personally, I've always thought this was a pretty crappy metric, to be honest. When I reflect on it, even before AI was a thing, what I noticed, especially as a .NET developer, is that over time, yes, more and more code gets written not by humans. To give you an example from many years back: if you know about WinForms or WPF, basically desktop development using .NET tools, you have code that is generated by the computer. You get a drag-and-drop visual editor for putting buttons and text boxes on a form, you lay that all out, and what the computer does for you is generate the code. It is source code, you can go read it, and it still gets compiled. Depending on the application you're looking at, you might have a disproportionate amount of code that was generated by the computer versus a human. We have the same thing in WPF as well. And if you want one more example that's a little more modern than WinForms (even though I really love my WinForms in .NET), we also have source generators. Instead of patterns and practices that might rely on reflection, we're able to template out code: you can use a source generator to build all of that code for you based on a template, and that way the code is written and gets compiled, and you don't have to use reflection at runtime for certain scenarios. The more of this kind of stuff we have introduced, the more code we already have being written by computers, never mind AI. So when I hear
this AI metric, to me it sounds kind of silly. The other thing we don't get to know from just this metric is how much of that AI-written code had some amount of back and forth with a person, right? So I've always thought the metric was kind of silly, and I like that he's calling out that productivity is actually the thing we care about. Back to Sundar: "...due to AI, right? And it's tough to measure, and we kind of rigorously try to measure it, and our estimate is that the number is now at 10%. Across the company, we've accomplished a 10% engineering velocity increase."

It's interesting: when I look at Lex's face, I can't tell if that's an impressed expression or an "only 10%?" So we'll see what has to be said. Sundar continues: "...using AI, but we plan to hire engineers, more engineers, next year." Did we catch that? You have the CEO of Google saying, "We plan to hire more engineers next year." Anytime this conversation comes up around AI and its impact on software engineers, it always sounds like the exact opposite. We're going to hear from him in just a moment about what this actually means in terms of how AI is enabling engineers. But I want you to keep this in mind, because sometimes we hear from other leaders about how much more code AI is writing and the impact of AI on all of these different areas, and I think a lot of us automatically jump to "therefore, we don't need developers." In fact, we've seen some companies in popular media move to reduce staff in favor of AI
and then backpedal on it very quickly because they're going, "Oh crap, that doesn't really work." So, let's keep watching. Sundar continues: "Because the opportunity space of what we can do is expanding too, right? So I think, hopefully, for at least the near to mid term, for many engineers it frees things up. Even in engineering and coding, there are aspects which are so much fun: you're designing, you're architecting, you're solving a problem. There's a lot of grunt work, which all goes hand in hand, but hopefully it takes a lot of that away, makes it even more fun to code, and frees up more time to create, problem-solve, and brainstorm with your fellow colleagues. So that's the opportunity there. And second, I think it'll put creative power in more people's hands, which means people will create more, which means there will be more engineers doing more things. It's tough to fully predict, but in general, in this moment, it feels like people will adopt these tools and be better programmers."

On those two parts: he's saying he has this perspective that engineers become augmented, and I've seen this firsthand, even in my own personal projects. For those of you who are familiar, I build something outside of work called BrandGhost. I'm pretty active on social media, and I built BrandGhost to publish all of my social media content very regularly; it really helps me be on every platform all at once, all of the time. And I've even noticed, using AI tools,
especially GitHub Copilot with pull requests (which I'm overdue for a video on at this point, so stay tuned), that I've been able to get through grunt work that would improve my codebase but that I never have time for. It's not the top priority, but I can delegate it to AI now. I can have it go clean things up, then check out the pull request, review it, double-check things, and it's done. So instead of leaving that in my codebase to erode over time, I can address it sort of in parallel. There are situations, which I'll cover in that upcoming video, where I've gone to bed thinking, "Hm, I just had an idea for a feature," written it into GitHub as an issue, assigned it to Copilot, and woken up to a first version of an implementation already done for me. So I've really noticed that I can get through grunt work much more easily. And from a creative perspective, even if I outsource the creativity to AI, it's a first step I can use as a launching point to go create more things. So I really liked that part of what Sundar was saying.

The other thing he mentioned is the perspective that as more people come in doing more creative work, there is a need for more engineers. What I like about this is that it runs very counter, in my opinion, to how a lot of other people are framing things. When we hear these types of conversations with AI coming into the
mix, I don't know the right way to frame this, but it often seems like we're talking about some fixed amount of work that has to get done; there are bounds around the amount of work, and if you introduce AI to chip away at it, that therefore means you need fewer people. That's the framing I think people take on, and that's why "more AI" gets read as "fewer people." Now, the reality is, and I've said this on Code Commute vlogs (if you haven't checked out Code Commute, definitely check out that channel; I vlog going to and from work talking about this stuff), I have never worked anywhere that didn't seem to have an unlimited amount of work. That was true at startups, and it's been true at Microsoft on two different teams. It's always a matter of trying to prioritize the most important thing. It's not like we run out of work to do; there's always something really important, and oftentimes we're battling to not disrupt the most important thing with something else that's even more important. There's so much work to get done. So the way Sundar is talking about this is, in my mind, very similar to what I've experienced: there's just always more stuff to do. Introducing AI to help doesn't mean that all of a sudden developers aren't needed; it just means we can get through more and more of that work, which I think is fascinating. I really like that perspective versus the bounded-amount-of-work framing. Back to Sundar: There are more people playing
chess now than ever before, right? So it feels positive that way to me. At least speaking from within a Google context, that's how I would talk to them about it. Then Lex: "I just know anecdotally a lot of great programmers are generating a lot of code. They're not always using all the code; there's still a lot of editing. But even for me, and I still do programming as a side thing, I think I'm like 5x more productive."

I don't know where people get these numbers, to be honest. I don't know how anyone has quantified their own productivity. If you were twice as productive, I feel like that would be a ridiculous thing to try to measure. We hear about 10x developers, and Lex here is saying he's five times more productive. I don't even understand how people arrive at that, and I'm not trying to pick on Lex; I just don't understand where that number comes from. Let's keep going. Sundar: "Even for a large codebase that's touching a lot of users, like Google's, I'm imagining very soon that productivity should be going up even more. The big unlock will be as we make the agent capabilities much more robust. I think that's what unlocks that next big wave."

Just to pause on that: in other content that I've created, especially on Code Commute at this point, I've talked about using agents to go build code, and I have historically found lackluster results. That's in Cursor, that's in VS Code, that's in Visual Studio. When I use agents to write the code, I often feel like I have to do so
much handholding with the prompts that I might as well write it myself or go back to just using chat. In chat mode, I can ask it things, and in that context it does a really good job; I can use the output, write the code or basically apply the code changes, and I've found a lot of success with that. When I get into using agents is where things start to fall apart, because either the agent has too much context and it waters down the approach (it doesn't know the right patterns to use because it grabbed too much), or it's just completely wrong and I have to really dial that context in. So it's just too much handholding in the end. When I use Copilot with agents in GitHub, though, that has been transformative for me, because it has access to the same context, my codebase, but it seems to take a better iterative approach. So when Sundar says "more agentic capabilities": even for me, going from agents in my IDE to agents in GitHub has been transformative, from not really helpful to... well, if I showed you a screenshot of my git commits over the past three weeks, you would see mostly Copilot committing things for me, with me sprinkling in edits. Truly transformative. Back to Sundar: I think the 10% is a massive number. If tomorrow I showed up and said you can improve a large organization's productivity by 10%, when
you have tens of thousands of engineers, that's a phenomenal number. And that's different from what others cite as statistics, like "this percentage of code is now written by AI." I'm talking about overall productivity, actual engineering productivity, which is a different thing, and the more important metric. But I think it'll get better. And there's no engineer who, if you magically became 2x more productive tomorrow, wouldn't just create more things, more value-added things. So I think he's reiterating what I was saying a little earlier: if you become more productive, it's not that you get through your work and then you're done forever; most engineers will just keep doing more things. I don't see it as a bounded amount of work, just to reiterate that. Sundar goes on: you'll find more satisfaction in your job, and there are a lot of aspects to that. The actual Google codebase might just improve, because it'll become more standardized and easier for people to move about, because AI will help with that, and that in turn will also allow the AI to understand the entire codebase better, which helps the engineering side. Then Lex mentions he's been using Cursor a lot as a way to program with Gemini and other models, and that one of its most powerful things is that it's aware of the entire codebase. This point from Lex is something I'm trying to personally experiment with a little bit more in my
own codebases. Given that in my own personal projects I'm living and breathing in them, I don't do a ton of documentation, and I have architectural patterns that drift. In the video I'll make on BrandGhost, where I talk about using GitHub Copilot in agent mode with pull requests, I've had it go through and start documenting things, and my goal is that as it cleans up the codebase and documents things, going forward it actually enables the LLM to be more and more effective. So as it's cleaning up, it's hopefully also enhancing its future ability to do a better job. One thing I try to do when it does the wrong thing is ask it, "How could I improve the prompt for this next time?" so I can add that to the Copilot instructions. And sometimes I'll say, "Make sure you update the documentation so that the next person or LLM coming through here knows how to apply this type of change effectively." So not only is it cleaning up and standardizing the code, I'm trying to get it to write documentation or other information that helps the next pass through. Sundar continues: it allows you to ask questions of the codebase, it allows the agents to move about it in a really powerful way, and that's a huge unlock. Think about migrations, refactoring old codebases, once we can do all of this in a much better, more robust way than where we are today. And Lex jokes: "I think in the end everything will be written in JavaScript and run in Chrome."
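As a quick aside on those "Copilot instructions": GitHub Copilot can pick up repository-level custom instructions from a file in the repo, commonly `.github/copilot-instructions.md`. A minimal sketch of what I mean, where the specific rules and the `src/Features/` path are hypothetical and purely for illustration, might look like this:

```markdown
<!-- .github/copilot-instructions.md (hypothetical example content) -->
# Guidance for Copilot in this repository

- Follow the existing architectural patterns; new features go under `src/Features/`.
- Prefer source generators over runtime reflection for templated code.
- After changing behavior, update the related docs so the next person
  (or LLM) knows how to apply this type of change.
- When a prompt produces the wrong pattern, record the corrected guidance here.
```

The idea is just that lessons learned from a bad generation get written down somewhere the tool will see them on the next pass.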
Sundar's reply: "I think, no, that's actually not going to happen." It better not! Let's keep going. Lex: "It's all going in that direction. I mean, just for fun: Google has legendary coding interviews, rigorous interviews for engineers. Can you comment on how that has changed in the era of AI? The whiteboard interview, I assume, is not allowed to have some prompts." Sundar: "Such a good question. Look, I do think we're making sure we'll introduce at least one round of in-person interviews for people, just to make sure the fundamentals are there. I think those will end up being important. But it's an equally important skill: if you can use these tools to generate better code, I think that's an asset. So overall, I think it's a massive positive."

And there we go. On that last note, I don't know if we got a full answer from Sundar, but I suspect most big tech companies at this point certainly have not shifted to AI-focused interviews. I'm pretty sure I've seen in Meta's documentation for some of their interviews, for example, that they still mention something like: we value things like AI, but they have a clause that basically says, "We want to interview you, not the AI." With that said, it's kind of interesting that Lex brought it up, because as we shift more and more in that direction, I can absolutely imagine that having AI incorporated into the interview process will become more and more relevant. Because
if these are tools that we're going to be using as developers, and we're expected to use them as they become more commonplace, I can imagine that being able to demonstrate your effectiveness with those tools will be important. As Sundar said, though, it's still very valuable, of course, to have foundations in software engineering and software development. We don't want to just cover our eyes, use ChatGPT, and take the code output; we need to understand what's going on.

So overall, personally, when I watched this, I got super excited, because I thought: heck yeah, I have something to share with people, if they haven't seen it already, that's not doom and gloom. You have the boss of Google telling you that AI is here to augment developers and that they plan to continue hiring software engineers. For the other companies, like Microsoft or Facebook, that aren't saying this outright, this is the message I would take away. It's a bit of confirmation bias for me, because this has been my opinion all along, so I was very happy to see Sundar talk about it. But I'm curious: I would love to hear your perspective. Do you agree with what he's saying? Do you believe it? Are you seeing or experiencing other things? Was this enough to shift your mind away from the doom and gloom, or are you still thinking that software engineering is going extinct? I'd love to hear from you. Thanks for watching, and I'll see you next time.