I'm cautious about generative AI for the same reasons a lot of people are. It was created by unethical means and exploits people, communities, and the environment. It didn't have to be this way; none of this was inevitable. I like to imagine a world where AI was created more responsibly, because I think these tools are powerful and important. I would be excited about the potential of these tools if they didn't require the disempowerment of people around the world. It's an uncomfortable duality: AI as it exists today is both wildly exploitative and incredibly useful.
Generative AI tools are particularly potent in the world of software engineering. I've seen firsthand how powerful these tools are, and it's undeniable that they'll be a staple in the field for years to come. In my opinion, software development is the first killer app for generative AI.
So setting aside the ethical concerns of these technologies for a moment, what are the implications for software developers adopting AI coding tools?
I can't navigate anymore
Everything you do is 'use it or lose it'. Whether it's lifting weights, remembering dates, or navigating in the car, when you stop practicing a skill it deteriorates. I've certainly experienced this with my sense of direction. In high school I could navigate in the car easily, but as the years went on I started using Google Maps more and more. Now, with Apple CarPlay, it's integrated right into my dashboard. My sense of direction is trash, and if Google Maps shut down I'd be perpetually lost and confused.
This is the crux of the issue for me when it comes to adopting coding tools as a software engineer. There is a lot of grunt work I don't mind automating, but these tools go farther than that. These tools can automate skills I'm not sure I want to lose.
As part of a recent project, I wanted to build a command line program with a pretty TUI that let me try an audio-based ML model using my computer's microphone. This was tangential to the main project and would have taken me a full day to build on my own. Instead, I asked GitHub Copilot to do it for me, and it was ready and working within an hour. This was incredibly useful and helped me test the ML model I was building, but it also took away the opportunity for me to learn about building TUIs and running ML models on a live audio stream.
And that's the tradeoff. These AI coding tools can do things for you, but in the process you don't get to do them yourself. You lose out on the learning and practice that keeps those coding and design muscles alive.
Creativity and creative problem solving come from being able to make connections between disparate concepts in your mind. You can't know ahead of time what you'll need to learn to come up with creative solutions in the future; it's by going through life learning as much as possible that we collect the connections that breed creativity. When you use AI tools to solve problems for you, you can lose out on collecting the experiences and concepts that help you be creative.
So what?
You could argue that if these tools are capable of doing things for you, then maybe those things aren't worth learning and practicing in the first place. If an AI coding tool can create a good TUI for me, why should I learn how to do it myself? Why should I care?
In an undergraduate engineering program, you learn a lot of core principles, and how those principles are derived into practical equations and applications. For homework, you work through these derivations and practice their application by hand. In the real world, engineers often use software that does this for them. They aren't writing out equations by hand; they're using simulation tools.
The point of learning the core principles and derivations is that they teach you the fundamentals of what the software tools are doing. Similarly, the point of homework and doing things by hand is that it helps you learn the material and internalize the principles and patterns. It's insufficient to simply listen to a lecture; to really learn the material, you need to work through it yourself.
I'd argue it's the same in software engineering. While these AI coding tools can do most of the day-to-day work for you, to really be a good engineer you need to understand the principles and concepts of the system you're building, and you need to understand what these tools are doing. To build that foundation, you need to work through the material yourself.
You learn by going deep and pushing through the friction of doing things by hand, especially if it's something you've never done before. My concern is that, by using these tools, I'm not learning and practicing the fundamentals I need to be good at my job. I'm missing out on opportunities to be inspired for future creative problem solving.
Make it hard on purpose
Now, I don't think the answer is to never use these tools. AI coding tools are incredibly powerful and helpful. I think it's incumbent on us as software engineers to be mindful of how we're using these tools, and consciously introduce friction to make sure we're still learning and maintaining our engineering muscles.
One way to do this is to periodically turn off agentic capabilities. Every once in a while, especially if something is new to you, turn off the features that let an AI tool write code for you. Instead, have it help you learn to do things yourself by asking questions. This practice of forcing yourself to do something by hand helps you learn the underlying concepts of how a system operates, and it leaves you better able to use AI coding tools going forward.
Be aware of how these tools can be used to make your life better and build better software, but at the same time, be mindful of what you're giving up. Be mindful of the muscles you aren't exercising, and take time every now and again to work them. As these tools become more competent, I think the engineers who'll stay relevant are the ones who use them to learn more and become better over time. If we simply let the tools do our jobs for us, we'll quickly become unemployable and obsolete.
P.S. - For students
If you're a computer science student, especially if you're in undergrad, I'm not convinced you should use AI coding agents at all. I think you'll learn a lot more writing code out by hand than letting an AI do it for you. Even if it feels tedious, the friction is the point. The friction is how you learn. You'll be better prepared to enter a world that uses AI coding tools when you understand how to do it yourself first.
I think AI tools can be a useful supplement, especially for explaining concepts you're having trouble with, but I recommend stopping short of letting them write anything for you, whether it's code or otherwise. Writing is thinking and you need to do the thinking yourself in order to learn.