Understanding the AI Job Erosion Dilemma and Its Impact on Work and Expertise
- Amir Abdelazim
- Feb 23
- 6 min read
There's a noise in the room right now that I haven't heard since the dot-com era. A kind of collective panic dressed up as forward thinking. The narrative is simple and terrifying: AI is coming for digital jobs, the engineering class is next, and if you write code, manage data, or sit behind a screen doing what a model can now do faster — your clock is ticking.
I've been around long enough to know that when everyone agrees this loudly, someone is usually wrong. Let me share where I think the real conversation should be.

1. Jobs vs. Tasks: The Distinction Nobody Wants to Make
Let's start with the most important distinction that's being completely ignored in most boardrooms: the difference between a job and a task.
A task is a unit of work. A job is a bundle of judgment, context, relationship, and accountability wrapped around many tasks. When AI replaces a task — writing a first draft, generating a report, translating a document — it doesn't automatically replace a job. It changes what the job looks like.
Here's the uncomfortable truth: a lot of what we defined as "digital work" over the last two decades was actually just digitised administration. We took paper processes, made them electronic, and called it transformation. We built massive ERP systems that required armies of IT consultants, integration specialists, and data entry staff. We replaced 100 people managing procure-to-pay with 100 different people managing the systems that replaced them. The system is now correcting itself. AI is not destroying digital jobs — it's exposing how many of those jobs were never really about thinking in the first place.
The jobs being eroded are the ones built on repetition, not reasoning. And that's been true of every major technology wave in history.
2. Skills Matrix: Has What We Need Actually Changed?
I keep hearing that the skills matrix has been rewritten. I'm not sure I agree — at least not in the way most people mean.
Think about what makes a great engineer. Is it the ability to memorise equations? To recall the physical laws governing a system? Or is it the capacity to look at a broken system, understand what it's really telling you, and design a solution that didn't exist before?
Think about what makes a great military leader — a sergeant, a commander. Is it the cut of their uniform? The ability to recite field manuals? Or is it the instinct developed through exposure to chaos, the trust built under fire, the judgment to know when the manual is wrong?
What we're discovering is something we always knew but never had to defend: attitude, curiosity, and creativity are not soft skills. They are the core skills. They always were. AI doesn't change that — it just strips away the camouflage of busyness that used to hide whether someone actually had them.
The skills matrix hasn't fundamentally changed. What's changed is that AI now does the low-level work fast enough to expose whether you were adding genuine value on top of it. Hire for curiosity. Hire for critical thinking. Hire for the ability to ask the question the machine hasn't been trained to ask. That was always the right approach.
3. The Expertise Problem: Will the Next Generation Get the Learning Curve?
This is the question that genuinely keeps me up at night — and I don't hear it discussed enough.
Expertise is not a destination. It's a journey. Every expert I've ever respected earned that status through a specific arc: entry-level exposure, repeated mistake-making, pattern recognition, and eventually — intuition. That last part matters more than people admit. When a senior engineer looks at a network diagram and says "something feels off" before running a single diagnostic — that's not magic. That's ten thousand hours of compressed experience firing at once.
Here's the dilemma: if AI handles the entry-level tasks — the first drafts, the basic debugging, the initial data pulls — where does the next generation build that foundation? How do you develop intuition if you've never had to grind through the work that builds it?
We did something similar with GPS. A generation of people now cannot read a map. We didn't just automate navigation — we atrophied the skill. If we're not careful, we will do the same thing to technical judgment at scale.
The answer is not to ban AI from junior workflows. That's not realistic. The answer is intentional exposure design: deliberately creating the conditions where emerging professionals still have to think, fail, reconstruct, and own problems. The learning curve must be preserved, even if the route through it looks different.
4. The Self-Correction Nobody Wants to Admit
Here's the part that's going to sting a little.
A significant chunk of the digital job market we built over the last two decades was man-made complexity. We decided to build ERP systems of staggering complexity. We created digital transformation programmes that consumed more budget than the revenue they were meant to protect. We added layers of integration, middleware, security tooling, and compliance frameworks — many of which were solving problems that better architecture could have avoided in the first place.
We replaced 100 people who understood the business with 100 different people who understood the system — and called that progress. Now AI can handle a meaningful portion of what those 100 system people were doing. And the industry is reacting with shock.
It shouldn't be shocking. Markets correct. Efficiency eventually wins. The digital economy is returning to a more honest value equation. The question is not whether this correction was coming — it was. The question is whether the people in these roles prepared for it, and whether the organisations employing them designed careers with enough depth to survive a technological shift.
5. The AI Bubble: Yes, We're In One. No, It Doesn't Mean What You Think.
Yes, we are in an AI bubble. Let me say that clearly, without anxiety and without apology.

Think about the dot-com era. A hundred horses in a race. The noise was deafening. The valuations were insane. Promises were made that physics and economics could never keep. And then the dust settled. A handful of companies — Google, Amazon, Salesforce — became the infrastructure of the modern economy. The rest? Gone or irrelevant.
The same logic applies now. Of every hundred AI startups burning through capital today, roughly ten will become dominant. Ten will die quickly. The other eighty will hustle in the middle, some pivoting into niches, some becoming acquisition targets, some simply fading. Some investments will go to zero. The noise will be painful.
But here is what's different from dot-com: the underlying value chain of AI is real, immediate, and physically anchored.
The energy companies powering data centres are not going away. The chipset manufacturers — particularly in AI accelerators — are not speculation, they are infrastructure. Agentic AI — systems that actually execute tasks rather than just suggest — is maturing fast and the business cases are real. Physical AI, the intersection of robotics and intelligence, is beginning to reshape manufacturing, logistics, and healthcare in ways that are not theoretical.
The bubble is in valuations and expectations. The foundation is real.
6. What to Actually Do About It
If you're a leader, here's my honest advice:
Stop reading the wave and start positioning in it. The value chain that survives the bubble has clear layers: energy and compute infrastructure, chipsets and hardware, agentic AI systems, and physical AI. Understand where your organisation sits relative to those layers. Are you a consumer of AI? A builder of AI systems? A provider of the infrastructure AI runs on? Each position requires a different response.
Build people who manage AI, not people who are managed by it. The most dangerous future is one where your team can't function without AI and can't question what AI produces. You want people who use these tools with authority — who know enough to catch the model when it's confidently wrong, and who have enough real-world judgment to know when to ignore it.
Protect the learning curve. Don't automate junior roles out of existence — redesign them. Create deliberate conditions for people to develop expertise, even if the entry point looks different. The intuition you'll need in five years is being built right now, or not being built. That's your choice.
And finally — stop treating this as an IT decision. The AI era is a civilisational shift on the scale of the agricultural revolution. The companies and leaders who thrive will be the ones who understand that early enough to act on it, not just discuss it.
The dust will settle. Make sure you're on the right side of it when it does.