DEV Community

The Untold Issues with AI Job-Takeover Theory (Chapter 1)

Tiago Nobrega on April 26, 2026

TL;DR: AI is an abstraction for coding, and like all abstractions, it doesn't replace the need for the underlying knowledge. The Theory ...
 
Mike Talbot ⭐

IDK, the advances from last April to this April are pretty fundamental. Sure, it's not perfect, and sure, it would be bad in the hands of a total amateur, but I'm not sure that's lasting. Right now, in a well-architected and documented code base, below-average developers can build something amazing, presuming they are using tools across the piece: architecture review, code review, plan review, ideation support, etc. That's right now. On greenfield it's still a mess, but that won't last.

Tiago Nobrega

I totally agree. Well-architected and documented code bases are easy to expand if you are really good at pattern matching, which LLMs are. But with every change there's a possibility of diverging from that state. Someone with a deep understanding of such practices is required to prevent that. How long can it last? A long time, maybe. Again, if the requirements are simple enough, that code base is probably fine.

Victor Okefie

You're right. AI is just another layer. Like ORMs didn't kill SQL. They just made the pain of not knowing SQL show up differently. Same with AI. It'll write code fast. But when it breaks, you still need to know why. The leak doesn't go away. It just moves.
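A minimal sketch of the kind of leak Victor means, using the classic N+1 query problem and Python's built-in sqlite3 (the tables and the ORM-style loop are illustrative, not from any specific ORM). The "easy" abstraction quietly issues one query per row; knowing the SQL underneath collapses it into a single JOIN:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Linus');
    INSERT INTO posts VALUES (1, 1, 'On Engines'), (2, 1, 'Notes'), (3, 2, 'Diffs');
""")

query_count = 0

def run(sql, args=()):
    # Count every round trip to the database so the leak is visible.
    global query_count
    query_count += 1
    return conn.execute(sql, args).fetchall()

# "ORM-style" lazy loading: 1 query for the parents, then 1 per parent
# for the children -- the N+1 leak (here: 1 + 2 authors = 3 queries).
authors = run("SELECT id, name FROM authors")
for author_id, _name in authors:
    run("SELECT title FROM posts WHERE author_id = ?", (author_id,))
n_plus_one = query_count

# Knowing the SQL underneath: one JOIN does the same work in one round trip.
query_count = 0
rows = run("SELECT a.name, p.title FROM authors a JOIN posts p ON p.author_id = a.id")
```

The abstraction didn't remove the need to understand queries; it just deferred it until the row count made the leak hurt.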

Varsha Ojha

Interesting perspective. A lot of the conversation focuses on replacement, but in practice it feels more like reshaping how work gets done. The bigger shift is in how people adapt their thinking and workflows around AI.

Tiago Nobrega

Exactly. How do we make the best use of it? When should we rely on it, and when shouldn't we?

Varsha Ojha

For me, AI works best when the task is repetitive, research-heavy, pattern-based, or needs a first draft. But I’d be careful relying on it for judgment, context, ethics, final decisions, or anything where being “technically correct” is not enough.

So maybe the question is less “will AI replace people?” and more “which parts of the work should stay human by design?”

Mykola Kondratiuk

Leaky abstractions is a solid lens, but it cuts both ways: when abstractions fail at scale, teams build a better layer on top rather than drop down. AI will probably evolve the same way, abstracting its own failures into the next layer.

Tiago Nobrega

Sure, but again, the leak won't stop. It will just let you get further without the knowledge. And that adds up to the point where you need to ask yourself: is it worth it?

Mykola Kondratiuk

That inflection point is real, and it hits PMs faster than engineers. We are making scope and risk calls without the depth to validate AI outputs. At some point it stops being about abstraction and becomes: do you understand what you are shipping?

Gilder Miller

Great point about leaky abstractions!
This is exactly what I see working with ML systems - AI tools are helpful until they hit edge cases, then suddenly you're digging into the fundamentals. The ORM comparison is spot on. It's not about replacing developers, but changing where we actually need to apply our expertise.
Keep the posts coming!💪

Tiago Nobrega

Thanks! That also means changing which skills to craft as a developer. Fast typing is becoming less useful.