The industry spent years telling us "AI will augment you, not replace you." Then 78,000 tech workers got laid off in Q1 2026, with nearly half of those cuts directly attributed to AI reducing the need for human workers.
That gap — between the messaging and the reality — is where the mental health crisis lives.
The leadership vacuum
When everything changes at once, you look up for direction. How should I be thinking about this? What skills should I build? Is my job safe?
The honest answer from most industry leaders right now is silence. Or worse — vague platitudes about "embracing change" while quietly restructuring teams.
There's no playbook for this. The executives are figuring it out in real-time just like everyone else. But that silence compounds the anxiety. When you're underwater, you want someone to tell you which way is up. Instead, you get "we're all learning together."
That's not leadership. And the absence of it is felt across the industry.
FOBO is real
It has a name now: FOBO — Fear of Becoming Obsolete. The creeping sense that your skills are degrading faster than you can upskill. That you're falling behind even while working harder than ever.
This isn't imposter syndrome. Imposter syndrome is feeling like a fraud when you're actually competent. FOBO is watching the ground shift under your feet in real-time. Skills you spent years mastering — the kind that used to take a decade to develop — can now be generated from a single prompt.
24% of employees report worsened mental health from AI-related information overload. 23% report a reduced sense of control over their future.
And unlike previous layoff cycles where you could dust off the CV and find a similar role somewhere else, many of the roles being eliminated are disappearing across the entire industry simultaneously. There's nowhere to go sideways.
Your flow state is under attack
Here's what nobody talks about enough: the thing that made engineering enjoyable — the deep focus, the zone, the craft of building something complex — is getting harder to access.
Research shows developers now spend 51.5% of their coding time in "LLM interaction states" — verifying suggestions, crafting prompts, deciding whether to accept or reject. Every AI suggestion is a micro-context-switch.
The disruption is worst during implementation — 32.7% disruption rate during the actual building, compared to just 7% during debugging. The deep work gets interrupted the most.
For some people, this is fine. They adapt quickly, and the AI becomes an extension of how they think. For others, every AI assist becomes a fix-it task — slowing momentum, breaking flow, eroding confidence in their own abilities.
One controlled study found AI tooling actually increased completion time by 19% for certain developers. It's not universally making us faster. But the expectation that it should be making us faster? That adds its own pressure.
Everyone has an opinion now
The barrier to entry has dropped dramatically. PMs, founders, stakeholders — people who never touched code before — can now generate working prototypes and have informed opinions on implementation.
This can feel threatening if your identity is wrapped up in being "the person who can build things." Suddenly everyone can build things. Or at least, everyone can build something.
But this is actually a growth area if you reframe it.
75% of non-technical staff can now contribute to solution creation. That's not a threat to engineers — it's a massive expansion of who you can collaborate with. The PM who can prototype their own idea before the spec review? That's a better starting conversation.
The value shifts to what happens after the prototype. Can it scale? Is it secure? Will it be maintainable in six months? The gap between "it works" and "it works well, at scale, safely" is where engineering expertise still matters enormously.
If you can bridge that gap — and help non-technical collaborators understand why their working prototype needs architectural changes — you become more valuable, not less.
Smaller teams, higher stakes
The industry is consolidating around smaller, more empowered teams. Gartner forecasts smaller core engineering teams augmented by AI, producing 2-5x the output of previous team structures.
Fewer people. More responsibility per person. The safety net of large teams — where you could specialise narrowly and rely on others to fill the gaps — is disappearing.
Low-performing teams using AI improved 4x more than already-high-performing teams. The tools amplify whatever system they're dropped into. If your team has good fundamentals — clear ownership, solid review practices, strong communication — AI makes you fly. If your team was already struggling, AI accelerates the chaos.
The pressure on individuals within these smaller teams is real. There's less room to hide, less buffer when someone's having a rough week, higher stakes on every decision.
What you can actually control
I'm not going to pretend this is easy or that there's a simple framework that makes the anxiety disappear. But there are things that help.
Specialise in the change itself. The engineers who'll come out ahead are the ones who get fluent in the new tools faster than the industry average. Not just using Copilot or Claude Code — actually understanding the patterns, the limitations, where they accelerate and where they mislead. Become the person on your team who knows how to wield these things well.
Separate what you can control from what you can't. You can't control the market. You can't control whether your company decides to restructure. You can't control the pace of AI advancement. But you can control your skills, your adaptability, your willingness to learn. Pour your energy into the things you can actually affect.
Become the bridge between business and execution. Code is cheap now. Understanding what to build — translating a business problem into the right technical solution — that's where the value concentrates. The closer you are to understanding the why behind what you're building, the more durable your position.
Experiment rapidly. Keep expanding your toolkit. Try new tools aggressively, even if they feel threatening. The more you understand what's possible, the better your judgement becomes. Curiosity is a competitive advantage right now.
Resilience and mastery still pay off
There's a version of this story where the conclusion is doom and gloom. Everything's changing, nothing's safe, good luck out there.
That's not how I see it.
The fundamentals haven't changed as much as the noise suggests. Understanding systems deeply, thinking clearly under pressure, building things that actually work at scale — these skills still compound over time.
What's different is that the path to mastery runs through new terrain now. The engineers who'll thrive aren't the ones pretending nothing's changed. They're the ones building new muscle while the ground shifts. Getting good at the new tools. Developing judgement about when AI helps and when it hinders. Staying close to the business problems, not just the technical implementation.
Resilience isn't about being unaffected by change. It's about having enough foundation that you can adapt without falling over. And mastery — real mastery — has always been about continuous learning.
That hasn't changed. If anything, it matters more now than ever.