
AI Didn't Make Software Engineering Easier. It Made the Hard Parts Harder.

Praveen Rajamani on May 14, 2026

When I started using AI tools seriously across my side projects, I expected the work to get easier. AI handles the boilerplate, I focus on the inte...
 
Mike Talbot ⭐

I'm finding work generally much easier and the ability to deliver significantly increased. The dopamine patterns are totally different and I think that can mean you "try too much". I also think the massively increased output puts much more stress on other parts of a business (like sales, marketing, ops).

Why did the Google engineer get paged at 2am? That wasn't an AI issue, that was a Google strategy and resourcing issue.

 
Praveen Rajamani

Glad it feels easier for you! But I think the real cost shows up later - more output means more surface area to maintain, more regressions to catch, more edge cases nobody thought about at speed. The delivery feels faster, but the tail gets longer. That is where the exhaustion quietly builds up.

 
Yunetzi

Great take. AI speeds up some tasks but reveals gaps in our mental models. The win: clear specs, sharper prompts, better tests, and a human in the loop to sanity-check the outputs.

 
Praveen Rajamani

So true! Clear specs and a human sanity-check are still non-negotiable 😄
AI just helps you build the wrong thing faster if you are not careful!

 
HR Pulsar

This is the part of the AI discussion that feels massively underexplored to me: can engineers still become really good at the hard 20% if they stop regularly doing the repetitive 80% that historically built the experience in the first place?

Because that “boring” work wasn’t just labor. It was training data for humans.

If AI removes too much friction too early, we may accidentally optimize away the path that creates senior engineers.

 
HR Pulsar

Also — the story about the Google engineer getting paged at 2AM isn’t really an AI problem to me. That’s bad corporate culture and broken communication boundaries wearing an AI-shaped costume.

A lot of companies would happily turn any productivity tool into an excuse for permanent urgency. AI just exposed it faster.

So yeah, I fully agree with your point about automation shifting work from routine execution toward high-cost decision making. But that’s also a labor ethics and regulation question now.

Feels like this is the real challenge for our generation:
adapt fast enough to survive the shift, while still making the new reality livable for actual people — not just AI vendors and enterprise slide decks.

 
Praveen Rajamani

Yeah, the 2AM paging was never really about AI; that is just a company pushing hard and calling it normal. AI just made it easier to justify. But your point about making the new reality livable - that's the bit most people are ignoring. Everyone's talking about how to adapt fast enough. Fewer people are asking whether the pace itself is sustainable. That feels like the more important conversation right now.

 
HR Pulsar

Maybe I’m just getting old, but the pace for the last 5–10 years feels borderline unsustainable 🙃

A lot of people don’t want to spend their entire lives endlessly re-optimizing themselves for the newest stack, framework, AI tool, productivity ritual, and corporate trend cycle. Some people just want to do good work, live comfortably, and not feel like a replaceable productivity horse in an infinite benchmark race.
I think that’s a pretty human response to an industry that increasingly treats exhaustion like ambition.

 
Praveen Rajamani

You are not getting old, I think a lot of people feel this way but don't say it because it sounds like complaining 😅 But wanting to do good work and actually have a life isn't a weird ask. The industry just made constant self-upgrading feel like the default. And at some point, you have to ask - upgrade for what exactly? 🤔 A career should be something you can actually live in, not just survive.

 
Praveen Rajamani

This one really stuck with me. That repetitive work wasn't just boring; it was how you built instinct. You learned to spot bugs, understand systems, and think ahead by actually doing the slow stuff first. If juniors skip all of that, they will have gaps they won't even notice - until something breaks in production and they don't know where to start. AI might actually be harder on new engineers than on experienced ones. The ladder is disappearing while they're still trying to climb it.

 
Joyram Bhattacharjee

Somewhat true, somewhat false.

 
Beatriz Albernaz

This resonates a lot from a security perspective. The speed AI enables doesn't shrink the attack surface, it widens it. The gap between "shipped" and "audited" grows with every sprint, and most teams don't feel that until something breaks badly.
Pentesting used to catch issues that built up slowly. Now that accumulation is on fast-forward and nobody's slowing down to check.
Your point about knowing when not to use AI is spot on. Auth, permissions, data isolation, payments -> write it yourself, read every line, own it before it goes to prod. The two hours you save are not worth the breach disclosure email. 🙌

 
Praveen Rajamani

The security angle is something I did not cover, but you are so right. Speed without review is just technical debt with a shorter fuse. And you said it perfectly - auth, payments, data isolation: write it yourself. Those are the places where "it looked fine" is the most dangerous sentence in engineering. The breach disclosure email is a really good way to put it. The two hours saved can cost months.

 
AudioProducer.ai

The "comprehension debt" framing in your data-fetching example matches something we see on the audio side at AudioProducer.ai, where the deliverable is audio rather than text and the comprehension cost shows up even harder. Authors run Auto-Assign on a chapter, the AI tags every line with a speaker, and the audio renders cleanly - but the writer's mental model of the cast structure never gets built. Three chapters later, when a minor character returns and the voice is wrong, they don't know which line was the character's first appearance to fix.

You can't grep an audio file, so the recovery path is text editor mode plus per-line re-tagging, which is the slow path that builds the instinct AI just skipped. HR Pulsar's "training data for humans" point lands the same way here: the old 80% of hand-tagging dialogue was annoying, but it was how writers learned to see their book as a cast structure rather than a stream of prose.

We've started thinking of the text-editor-mode toggle as the deliberate-slowdown surface, not the productivity tool, because it's the only place where the writer can refuse the speed and stay in contact with the manuscript.

 
leob

This:

"Sometimes writing code slowly is what helps you understand the system"

That's why I keep advocating, left and right (in comments here on dev.to), to still write some code manually ...