A veteran developer reflects on how the craft he loved has transformed over four decades, from hands-on machine mastery to AI-assisted abstraction, and what remains when the wonder feels different.
I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine. I understood that machine completely. Every byte of RAM had a purpose I could trace. Every pixel on screen was there because I'd put it there. The path from intention to result was direct, visible, and mine.
Forty-two years later, I'm sitting in front of hardware that would have seemed like science fiction to that kid, and I'm trying to figure out what "building things" even means anymore.
This isn't a rant about AI. It's not a "back in my day" piece. It's something I've been circling for months, and I think a lot of experienced developers are circling it too, even if they haven't said it out loud yet.

The era that made me
My favourite period of computing runs from the 8-bits through to about the 486DX2-66. Every machine in that era had character. The Sinclair Spectrum with its attribute clash. The Commodore 64 with its SID chip doing things the designers never intended. The NES with its 8-sprite-per-scanline limit that made developers invent flickering tricks to cheat the hardware. And the PC — starting life as a boring beige box for spreadsheets, then evolving at breakneck pace through the 286, 386, and 486 until it became a gaming powerhouse that could run Doom.
You could feel each generation leap. Upgrading your CPU wasn't a spec sheet exercise — it was transformative. These weren't just products. They were engineering adventures with visible tradeoffs. You had to understand the machine to use it. IRQ conflicts, DMA channels, CONFIG.SYS and AUTOEXEC.BAT optimisation, memory managers — getting a game to run was the game. You weren't just a user. You were a systems engineer by necessity.
And the software side matched. Small teams like id Software were going their own way, making bold technical decisions because nobody had written the rules yet. Carmack's raycasting in Wolfenstein 3D, the VGA Mode X tricks in Doom — these were people pushing against real constraints and producing something genuinely new. Creative constraints bred creativity.
Then it professionalised. Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible.
But it wasn't just the craft that changed. The promise changed. When I started, there was a genuine optimism about what computers could be. A kid with a Spectrum could teach themselves to build anything. The early web felt like the greatest levelling force in human history. Small teams were making bold decisions in a space where the rules were still being written.
That hope gave way to something I find genuinely distasteful. The machines I fell in love with became instruments of surveillance and extraction. The platforms that promised to connect us were really built to monetise us. The tinkerer spirit didn't die of natural causes — it was bought out and put to work optimising ad clicks. The thing I loved changed, and then it was turned towards ends I'm not proud to be associated with. That's a different kind of loss than just "the tools moved on."
But I adapted. That's what experienced developers do. That's what human beings do.
The shifts I rode
Over four decades I've been through more technology transitions than I can count. New languages, new platforms, new paradigms. CLI to GUI. Desktop to web. Web to mobile. Monoliths to microservices. Tapes, floppy discs, hard drives, SSDs. JavaScript frameworks arriving and dying like mayflies. Each wave required learning new things, but the core skill transferred. You learned the new platform, you applied your existing understanding of how systems work, and you kept building. The tool changed; the craft didn't. You were still the person who understood why things broke, how systems composed, where today's shortcut became next month's mess.
I've written production code in more languages than some developers have heard of. I've shipped software on platforms that no longer exist. I've chased C-beams off the shoulder of Orion. And every time the industry lurched in a new direction, the experience compounded. You didn't start over. You brought everything with you and applied it somewhere new. That's the deal experienced developers made with the industry: things change, but understanding endures.
This time is different
I say that knowing how often those words have been wrong throughout history. But hear me out. Previous technology shifts were "learn the new thing, apply existing skills." AI isn't that. It's not a new platform or a new language or a new paradigm. It's a shift in what it means to be good at this.
I noticed it gradually. I'd be working on something — building a feature, designing an architecture — and I'd realise I was still doing the same thing I'd always done, just with the interesting bits hollowed out. The part where you figure out the elegant solution, where you wrestle with the constraints, where you feel the satisfaction of something clicking into place — that was increasingly being handled by a model that doesn't care about elegance and has never felt satisfaction.
Cheaper. Faster. But hollowed out.
I'm not typing the code anymore. I'm reviewing it, directing it, correcting it. And I'm good at that — 42 years of accumulated judgment about what works and what doesn't, what's elegant versus what's expedient, how systems compose and where they fracture. That's valuable. I know it's valuable. But it's a different kind of work, and it doesn't feel the same.
The feedback loop has changed. The intimacy has gone. The thing that kept me up at night for decades — the puzzle, the chase, the moment where you finally understand why something isn't working — that's been compressed into a prompt and a response. And I'm watching people with a fraction of my experience produce superficially similar output. The craft distinction is real, but it's harder to see from the outside. Harder to value. Maybe harder to feel internally.
The abstraction tower
Here's the part that makes me laugh, darkly. I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they "didn't really know what was going on anymore." And I thought: mate, you were already so far up the abstraction chain you didn't even realise you were teetering on top of a wobbly Jenga tower.
They're writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that's making system calls to an OS kernel that's scheduling threads across cores they've never thought about, hitting RAM through a memory controller with caching layers they couldn't diagram, all while npm pulls in 400 packages they've never read a line of. But sure. AI is the moment they lost track of what's happening.
The abstraction ship sailed decades ago. We just didn't notice because each layer arrived gradually enough that we could pretend we still understood the whole stack. AI is just the layer that made the pretence impossible to maintain.
The difference is: I remember what it felt like to understand the whole machine. I've had that experience. And losing it — even acknowledging that it was lost long before AI arrived — is a kind of grief that someone who never had it can't fully feel.
What remains
I don't want to be dishonest about this. There's a version of this post where I tell you that experience is more valuable than ever, that systems thinking and architectural judgment are the things AI can't replace, that the craft endures in a different form. And that's true. When I'm working on something complex — juggling system-level dependencies, holding a mental model across multiple interacting specifications, making the thousand small decisions that determine whether something feels coherent or just works — I can see how I still bring something AI doesn't. The taste. The judgment. The pattern recognition from decades of seeing things go wrong.
AI tools actually make that kind of thinking more valuable, not less. When code generation is cheap, the bottleneck shifts to the person who knows what to ask for, can spot when the output is subtly wrong, and can hold the whole picture together. Typing was never the hard part.
But I'd be lying if I said it felt the same. It doesn't. The wonder is harder to access. The sense of discovery, of figuring something out through sheer persistence and ingenuity — that's been compressed. Not eliminated, but compressed. And something is lost in the compression, even if something is gained.
The fallow period
I turned 50 recently. Four decades of intensity, of crafting and finding satisfaction and identity in the building. And now I'm in what I've started calling a fallow period. Not burnout exactly. More like the ground shifting under a building you'd assumed, for all its constant change, had a kind of permanence, and trying to figure out where the new foundation is.
I don't have a neat conclusion. I'm not going to tell you that experienced developers just need to "push themselves up the stack" or "embrace the tools" or "focus on what AI can't do." All of that is probably right, and none of it addresses the feeling. The feeling is: I gave 42 years to this thing, and the thing changed into something I'm not sure I recognise anymore. Not worse, necessarily. Just different. And different in a way that challenges the identity I built around it and doesn't satisfy in the way it did.
I suspect a lot of developers over 40 are feeling something similar and not saying it, because the industry worships youth and adaptability and saying "this doesn't feel like it used to" sounds like you're falling behind. I'm not falling behind. I'm moving ahead, taking advantage of the new tools, building faster than ever, and using these tools to help others accelerate their own work. I'm creating products I could only have dreamt of a few years ago.
But at the same time I'm looking at the landscape, trying to figure out what building means to me now. The world's still figuring out its shape too. Maybe that's okay. Maybe the fallow period is the point. Not something to push through, but something to be in for a while.
I started programming when I was seven because a machine did exactly what I told it to and felt like something I could explore and ultimately know, and that felt like magic. I'm fifty now, and the magic is different, and I'm learning to sit with that.
