I’ve been using AI for months now.
Not casually, seriously.
And I’ve started noticing something uncomfortable.
The Observation
Everyone is busy.
...
Creating something new and valuable is impossible for AI to do alone. It requires thinking, and sadly, most people are consumers, not creators. Social media has created a generation of slaves to the dopamine rush. It eats up energy that would be better spent creating something valuable, at least for oneself.
You raise an important point about intent and usage.
AI can generate and combine ideas, but real value comes from human direction, judgment, and purpose. Tools don’t create by themselves, people do.
And yes, attention is a limited resource. Those who choose to create instead of just consume will benefit the most from these tools.
Unfortunately, I see many colleagues giving up on thinking and mindlessly relying on AI to do the job for them. This results in a massive amount of low-quality output generated quickly, without proper thought, and often without even checking.
As a Technical Writer, I see tons of generated docs that no one will ever read, because they are written in a style that makes it hard for a human to comprehend even the first sentence, not to mention the whole concept or tutorial.
And I'm not saying that AI is bad. Many people just over- and misuse it, so here we are, drowning in low-quality output ☹️
That’s a very real concern, and I agree.
The problem often isn’t AI itself, but the uncritical use of it. Speed without thought just scales noise.
I especially like your documentation example, content that is technically generated but not genuinely useful helps no one. Human judgment and readability still matter enormously.
You're right! I also feel that a lot of us don't spend time thinking.
This is my concept about the importance of thinking: dev.to/pengeszikra/a-game-for-the-...
That’s a powerful observation, and you’re absolutely right.
Most people don’t lack intelligence; they lack intentional thinking time. In today’s environment, attention gets fragmented, and without protecting it, deep thinking disappears.
Your concept sounds aligned with something important:
→ Thinking is not automatic anymore; it has to be designed and protected.
Very true. Do you have any strategies or processes you use to offset the cognitive costs of AI usage?
Great point. A few simple habits can help reduce the cognitive load:
Great suggestions, thank you.
You are welcome.
Agreed, our thinking abilities can decline over time. The mind is like a muscle: if we don't train it, it gets weaker.
Absolutely, that’s a powerful analogy.
Thinking really does behave like a muscle: what we exercise strengthens, what we outsource too much can weaken. That’s why using AI should ideally augment thinking, not replace the practice of it.
I’d push back on this. the thinking isn’t disappearing - it’s shifting layers. less on HOW, more on WHAT to build and whether it matters. different thinking, not less.
Fair pushback, and I agree, it’s more accurate to say thinking is shifting layers, not disappearing.
The center of gravity moves from how to implement toward what to build, why it matters, and how to evaluate it. That’s a different kind of thinking, arguably a higher-order one.
yeah - and the evaluation part is the hardest shift. execution is easy to measure. deciding what actually matters takes a lot longer to get feedback on.
Exactly, that’s the harder layer.
Execution gives immediate signals; evaluation often has delayed feedback, which makes judgment much tougher to develop. And that’s precisely why it becomes a differentiator.
yeah. and when feedback is slow, most people optimize for what's measurable - which is usually execution, not judgment. it self-reinforces.
Yes, that's true in most cases.
Nice work.
Totally agree.
I believe that in this haste to build everything and launch rapidly, we are creating an IT security debt that may cause another 'Great Depression' like that of 1929.
Nobody is paying any attention to it. Or if there are folks signaling a doomsday, their voices are being drowned out by AI marketing tactics (much like snake-oil marketing).
You raise an important concern. Speed without security discipline can absolutely create security debt, and that risk is often underestimated in periods of hype.
Innovation and caution have to move together. The answer isn’t slowing progress, but making security and resilience part of the design conversation from the start. Your warning about not letting signal get drowned by marketing is well taken.
Yeah fair point :-)
On dev.to though I do see a lot of articles that express caution and thoughtfulness about how to apply AI 'responsibly' - the awareness is there, I would say ...
But I agree that we shouldn't lose our basic skills by ONLY relying on AI to write all of the code or "do all of the thinking" - we should stay in control, and I'd also say we should even maintain a minimum level of "muscle memory" when it comes to writing code 'manually' ...
That’s a fair observation, and encouraging, honestly.
I agree, there is growing awareness, especially in developer communities, that responsible use matters as much as capability. That kind of caution and thoughtfulness is a healthy sign.
Yeah, because of AI we get higher RAM prices, more data centers that consume enough power to run an entire city, more "pure vibe coders", more curators, more operators, more AI-generated content, and fewer jobs because many automatable roles are being replaced by AI. What a year to be alive.
What a year indeed 😄 and that captures the paradox well.
AI is creating new roles, new pressures, new opportunities, and new disruptions all at once. It’s messy, exciting, and a bit chaotic. Every major technology wave has looked like this early on. The challenge is learning how to adapt, not just react.
It all depends on how it's used. I use it to help me write stories, but the plots, settings, and characters are my own.
Exactly, it depends on how it’s used.
Using AI to support expression while keeping the plot, characters, and creative direction your own is a great example of balanced use. The tool can assist, but the storytelling remains yours.
This hits a bit too close. Sometimes I catch myself asking AI too quickly instead of sitting with the problem for a few minutes first.
Very relatable, and honestly, that awareness itself is a good sign.
Sometimes, even a few minutes of sitting with the problem first can change the quality of the questions we bring to AI. Often, the goal isn’t using AI less, but using it a little later.
so true
Thank you
We might not have the choice in the matter. My company actually chose for us, and unfortunately we're asked not to think as much.
That’s a sobering point, and a very real one.
Sometimes this isn’t an individual habit problem, but a system design problem, where incentives push people toward speed over thought. In those environments, preserving even small pockets of independent reasoning becomes valuable.
And ironically, that may make thoughtful people even more important, not less.
I do not disagree with the premise. But if this was written with AI, it undercuts the argument. You cannot critique outsourced thinking while outsourcing the delivery of that critique. That is the contradiction.