Steve Yegge discusses how AI is reshaping software engineering, the rise of "vibe coding," and why developers must adapt to a rapidly changing craft.
Steve Yegge has spent decades writing software and thinking about how the craft evolves. From his early years at Amazon and Google to his influential blog posts, he has often been early to spot shifts in how software gets built. In this episode of Pragmatic Engineer, I talk with Steve about how AI is changing engineering work, why he believes coding by hand may gradually disappear, and what developers should focus on instead.
We discuss his latest book, Vibe Coding, and the open-source AI agent orchestrator he built called Gas Town, which he said most devs should avoid using. Steve shares his framework for levels of AI adoption by engineers, ranging from avoiding AI tools entirely, to running multiple agents in parallel. We discuss why he believes the knowledge that engineers need keeps changing, and why understanding how systems evolve may matter more than mastering any particular tool.
We also explore broader implications. Steve argues that AI's role is not primarily to replace engineers, but to amplify them. At the same time, he warns that the pace of change will create new kinds of technical debt, new productivity pressures, and fresh challenges for how teams operate.
Key observations from Steve
1. A prototype-as-product model is replacing the build-then-dump cycle
At Anthropic, Steve says teams create many prototypes rapidly and just ship the best one. Claude Cowork reportedly went from prototype to launch in just 10 days. Meanwhile, "slot machine programming" – building 20 implementations and picking the winner – is becoming normal practice for teams.
2. The IDE could be evolving into a conversation and monitoring interface, not a code editor
Steve sees tools like Claude Cowork as the return of the IDE, focused on managing agent workflows rather than on coding by hand. He predicts these new IDEs will center on conversations with AI agents and on monitoring them.
Side note: I'm not sure I foresee conversational tools appearing just yet, or IDEs turning into such tools – but we do see tools like Claude Code being wildly popular among devs, as per our latest AI tooling survey.
3. Reading ability is becoming a blocker for wider AI adoption
Some devs struggle with the walls of text that current AI tools produce: Steve observes that five paragraphs is already a lot to read for many of them. He predicts that in the very near future, most people will program by talking to a visual avatar rather than by reading terminal output.
4. AI coding has a spectrum, and most engineers trend near the bottom
Steve describes eight levels, from "no AI" to "multi-agent orchestration," with most engineers currently at levels 1–2: asking an IDE for suggestions and carefully reviewing output. He suspects such engineers will be left behind.
5. Monolithic codebases are a big blocker to AI adoption in enterprises
AI agents have a ceiling of roughly half a million to a few million lines of code that they can work with effectively. If your codebase is a monolith that won't fit in a context window, AI agents won't work well with it.
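To make this ceiling concrete, here is a back-of-the-envelope sketch for checking whether a codebase could even fit in a single context window. The ~4-characters-per-token ratio, the 200,000-token window size, and the file-extension list are all assumptions for illustration, not figures from the conversation:

```python
import os

# Assumption: ~4 characters per token is a common rough heuristic for source code.
CHARS_PER_TOKEN = 4
CODE_EXTENSIONS = {".py", ".js", ".ts", ".go", ".java", ".rb", ".rs", ".c", ".cpp", ".h"}

def estimate_repo_tokens(root: str) -> tuple[int, int]:
    """Return (total_lines, estimated_tokens) for source files under root."""
    total_chars = 0
    total_lines = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1] not in CODE_EXTENSIONS:
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue
            total_chars += len(text)
            total_lines += text.count("\n")
    return total_lines, total_chars // CHARS_PER_TOKEN

def fits_in_context(estimated_tokens: int, window_tokens: int = 200_000) -> bool:
    """Check whether the estimated token count fits in one context window."""
    return estimated_tokens <= window_tokens
```

In practice agents don't need the whole repo in context at once – they search and read files selectively – but a quick estimate like this shows why a multi-million-line monolith is a different problem from a well-factored service.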
6. What software engineers need to know keeps changing
In the 1990s, any decent software engineer knew Assembly; today, almost no developer does, because Assembly has long been superseded by technical progress. What engineers "need" to know today is different from the '90s, and that process continues with AI, changing which parts of the craft are essential for devs. We grumble about this, but grumbling won't change anything by itself.
7. SaaS companies that don't offer platforms and APIs will be out-competed
Steve uses Zendesk as an example: if your product doesn't expose APIs, then AI-native companies will just build bespoke replacements. "If Zendesk doesn't make themselves a platform, then they'll put themselves out of existence."
8. There's a "Dracula Effect" where AI-augmented work drains engineers faster than traditional work
Steve says you may only get about three productive hours a day at max speed, but during that time, you could produce 100x more output than before.
9. Even if AI progress stalls, it's worthwhile getting proficient at working with parallel agents
Steve argues that since models as capable as Opus 4.5 already exist, we don't need smarter models so much as better orchestration layers. The worst outcome for someone who invests in learning AI tools is that they gain a skill set that stays useful, whether the models improve or not!
The conversation
Before we start
We recorded this episode with Steve in Utah in early February, when both of us attended Martin Fowler's The Future of Software Development workshop. Unfortunately, the audio recording for the episode turned out to be of poor quality, so I published a write-up of this conversation one month ago as a deepdive: Steve Yegge on AI Agents and the Future of Software Engineering. The article captured the essence of what Steve shared, but it felt like a shame not to share the conversation itself, and just how animated and excited Steve got when talking about the software engineering craft. Thanks to the help of software engineer Tatsuhiko Miyagawa and the audio post-production software Auphonic, we managed to fix the audio issues, and you can now enjoy the full episode, including parts that the deepdive omitted.
Steve's latest projects
Steve has been working on several projects that showcase his vision for AI-assisted development. His book Vibe Coding explores how developers can work with AI agents to build software more efficiently. He also created Gas Town, an open-source AI agent orchestrator that he describes as something most developers should avoid using – a deliberately provocative stance that reflects his belief that most developers aren't ready for the complexity it introduces.
Shifts in what engineers need to know
One of the most striking points Steve makes is that the fundamental knowledge base for software engineers keeps evolving. He points out that in the 1990s, any competent software engineer knew Assembly language. Today, almost no decent developer knows it because Assembly has been superseded by technical progress.
This pattern continues with AI: the parts of the craft that are essential for developers keep shifting, and grumbling about it won't change anything by itself.
Steve argues that understanding how systems evolve may matter more than mastering any particular tool. The ability to adapt and learn new paradigms is becoming the core competency, rather than deep expertise in any specific technology.
Steve's current AI stance
Steve describes himself as being at the forefront of AI adoption in software development. He's moved beyond simply using AI tools for code completion or generation – he's building systems that orchestrate multiple AI agents working in parallel.
He's observed that most engineers are still at the early stages of AI adoption, using tools like GitHub Copilot for basic suggestions and carefully reviewing all output. He suspects engineers who remain at these lower levels will be left behind as the industry moves forward.
The problem of too many people
One of the more provocative points Steve makes is about the changing nature of software development teams. He suggests that as AI tools become more capable, the traditional model of large development teams may become obsolete. Instead, small teams or even individuals augmented with AI could accomplish what previously required dozens of engineers.
This has implications for hiring, team structure, and how companies organize their engineering efforts. It also raises questions about what happens to the many engineers who may find their roles fundamentally changed or eliminated.
Why AI results lag in business
Steve discusses why businesses often don't see immediate results from AI adoption, even when the technology is demonstrably more capable. Part of this is organizational inertia – companies have established processes, workflows, and ways of measuring productivity that don't easily accommodate AI-augmented work.
There's also a learning curve. Teams need time to figure out how to effectively integrate AI tools into their workflows, and this experimentation period can look like reduced productivity even when teams are actually building valuable new capabilities.
The 'Bitter Lesson' explained
Steve references the "Bitter Lesson" from AI research, which observes that general methods that leverage computation are ultimately the most effective. This means that approaches that try to encode human knowledge and expertise into systems tend to be outperformed by approaches that simply scale up computation and learning.
For software engineers, this suggests that trying to build AI systems that perfectly mimic human reasoning may be less effective than building systems that can learn and adapt through massive amounts of computation and data.
The future of software development
Looking ahead, Steve sees software development becoming more about orchestration and less about manual coding. The IDE as we know it may evolve into a conversation and monitoring interface, where developers manage AI agents rather than write code directly.
He also predicts that the ability to read and understand code may become less important than the ability to communicate effectively with AI systems. If AI can generate and explain code on demand, the bottleneck may shift to understanding requirements and constraints rather than understanding implementation details.
Where languages stand
Steve discusses how programming languages are evolving in an AI-dominated world. While traditional languages won't disappear overnight, he sees a trend toward languages and tools that are more amenable to AI assistance and generation.
This might mean languages with simpler syntax, more predictable patterns, or better tooling support for AI-assisted development. It could also mean new languages designed specifically for AI-human collaboration rather than human-only development.
Adapting to change
Throughout the conversation, Steve emphasizes that the most important skill for software engineers is the ability to adapt to change. The specific tools and technologies will keep evolving, but the ability to learn new paradigms and integrate new capabilities will remain constant.
He suggests that engineers who resist change or cling to traditional ways of working will find themselves increasingly marginalized. Those who embrace new tools and approaches, even when they're uncomfortable or unfamiliar, will be best positioned for success.
Steve's predictions
Steve makes several predictions about the near future of software development:
- AI-augmented development will become the norm within 2-3 years
- Traditional coding by hand will become a niche skill
- The most valuable engineers will be those who can effectively orchestrate AI agents
- Companies that don't adapt to AI-augmented development will be outcompeted
- The pace of software development will continue to accelerate
He's particularly bullish on the potential for AI to democratize software development, making it possible for people without traditional programming backgrounds to create sophisticated applications.
Final thoughts
This conversation with Steve Yegge provides a fascinating glimpse into how AI is reshaping software engineering. His perspective, shaped by decades of experience at companies like Amazon and Google, offers valuable insights for developers trying to navigate this rapidly changing landscape.
The key takeaway is that change is coming whether we like it or not. The question isn't whether AI will transform software development, but how quickly and how completely that transformation will occur. Engineers who start adapting now will be best positioned to thrive in the AI-augmented future of software development.
For those interested in diving deeper, Steve's book Vibe Coding and his open-source project Gas Town provide concrete examples of how he envisions this future unfolding. Whether you agree with all of his predictions or not, his perspective is well worth considering as we collectively navigate the future of our craft.
