I was at an event this morning where someone said something that really resonated with me. "The new programming language is English." And you know ...
You mention Apple a lot in the post. You know Apple is the company that spends the least on AI of all the big tech companies.
If you leave agents out of the equation, another reason Apple launched the Neo might be that it sees Windows struggling and wants to attract more customers. The most efficient way to attract customers is by lowering the price.
Sure they add AI as a part of marketing, because it is the hot new thing in tech. Not including it would be dumb, and people at Apple aren't dumb.
Using agents as the only reason people are not going to buy machines with more power than they need is like looking at the world with blinkers on. The dumb terminal is a story tech has been selling since it introduced the cloud. That has been going on for a while now, and bigger and better machines kept getting built and bought.
Thanks for the comments.
I mention Apple because I use a Mac and have for many years, but it would be the same for any PC or laptop.
I don't think it is the only reason, but from my personal experience and from interacting with developers on a regular basis, a good amount of the work is moving to simple chatbot/Slack/other interactions.
There will always be a need for personal computers for certain use cases, but I think many of them can be covered by a personal agent that runs "somewhere".
These are two different cases. One is a work scenario, and the other is a private scenario.
In both cases, sensitive data is the reason people should not let an agent know everything.
That is why computers have passwords, biometric security, and hard disk encryption, and those are just low-security computers.
All that security comes with a penalty: higher-end specifications.
And with the rise of AI-generated malware and attacks, that security level is only going to go up.
Another trend requiring more powerful hardware is the fattening of operating systems and applications. Most app developers only consider Electron as a base, which is at least 100 MB just to get started.
Yesterday I read that Chrome adds an AI image of 4 GB. When you have 128 GB of storage, that is a big chunk you can't use for your own data.
The thing AI providers don't seem to acknowledge is that running models on personal computers is going to become mainstream. I don't know when, but the models that can do the things users want are getting smaller and less memory-heavy as time goes by.
And when that moment happens people are going to want a personal computer and not a dumb terminal.
This is exactly the point. The more we can do locally and privately, the more useful this will be. While it resides solely in the "cloud" its use is limited to things that people don't mind "sharing" with the world.
I worked on a VPS for a year and a half, and my experience was terrible. Slow open times and a low frame rate of around 16 FPS completely destroy the feel of handling a computer. Now I use a five-year-old Dell laptop with Windows as my work machine. It isn't the best either, but a WSL install of Ubuntu 20.04 saves me from being forced to work in Windows itself.
At home I have a MacBook Pro M1 with a Touch Bar, which is the perfect computer for most of my work. The trackpad is better than any mouse. It's lightweight with long battery life, and I don't even need a desk to use it. So I think the MacBook Pro is worth its price.
the casualty framing is sticky but i think the article is mostly right for general office work and mostly wrong for the audience reading it on Dev.to. for engineers, the MacBook does things a VPS structurally can't: running large local datasets without round-tripping every byte, debugging at full IDE speed without latency, working through a flight or a connectivity outage. those aren't edge cases, they're a daily fraction of the work. the "VPS replaces laptop" math also assumes the agent layer survives, but the failure surface there is invisible until it fires (account ban, api deprecation, country firewall, vendor pivot). honestly i think the next casualty is more accurately "low-end personal computers as a category" rather than personal computers full stop.
You just need a tablet or a MacBook Air unless you're a developer running Docker or a local LLM on your machine. Nothing new. Oh, and weren't BASIC, COBOL, and SQL all supposed to make English the new programming language? I must be getting old.
The personal-computer-as-casualty framing is interesting but I think it lands a year or two early. The blocker is not the agent on the cloud, it is the per-user cost curve. I am running five concurrent AI subscriptions right now (about $147/month, every month) and the heaviest workloads still happen on my laptop because pushing them up to a cloud agent is slower and more expensive than I expected. The PC becomes the casualty when the agent gets cheap enough that I stop opening my own editor at all. We are not there yet. We are at the awkward middle stage where I pay for both and use both.
Interesting perspective. I think the biggest shift is that GenAI is not just changing how we build software, but also what parts of the old workflow still deserve human effort.
The risk is assuming every manual step should disappear. Some should. But some steps carry context, judgment, and accountability that automation still does not fully replace.
The casualty framing lands because it's not really about jobs vanishing, it's about which skills compound and which get devalued. The skills that hold value are the ones AI can't replicate from public training data: the why behind a team's specific trade-offs, the historical accidents that shaped the system, the rejected paths nobody documents. That's exactly the layer we're working on at Mneme.
dev.to/valsaven/comment/3292g check it out
I think smartphones have also had their fair share of impact on PCs. But AI is definitely amplifying this.