The widespread adoption of AI coding tools is leading to an "expectations trap" in the tech industry, where employers are leveraging these tools to demand higher-level work from existing engineers without commensurate compensation, mirroring historical patterns seen with "full-stack" and "DevOps" roles.
Everyone is an architect now. Everyone is a staff engineer. Whether you think of yourself as a staff engineer or not doesn't matter. The idea is simple, and here's what companies are expecting: what was traditionally considered coding work is now being done by AI coding agents. They handle the implementation work that used to be the developer's main job. The real work, the stuff that actually matters, is the thinking: the architecture, the context, knowing what to build and where to put it. So the conclusion is that everyone is now working at an architect or staff engineer level.

That conclusion sounds optimistic. AI is pushing everyone up the ladder: junior engineers are thinking like seniors, seniors are thinking like staff engineers, and everybody levels up. Sounds great, right? But here's the problem. I've been in this industry long enough to recognize a pattern here. Every time it shows up it's dressed a little differently, but underneath it's always the same thing: more expectations, same paycheck.

Let me give you some history, because this is not new.
About 15 years ago, the industry invented a term: the full-stack engineer. If you were around for it, you'll remember the pitch. Instead of being just a front-end developer or just a back-end developer, you could be both. You'd be more versatile, more valuable. You'd understand the whole system. That sounds empowering, and who wouldn't want to be more capable? But here's what actually happened. Companies took what used to be two roles, a front-end person and a back-end person, and collapsed them into one. One person now handles both. The scope doubled, but the salary more or less stayed the same; it went up a little and then drifted back to normal. That's one story.
And it happened again with DevOps. There was the whole "you build it, you run it" philosophy: developers shouldn't just write code and throw it over the wall to an operations team. You should own your code in production, understand deployment, understand monitoring and infrastructure. Again, at a high level this sounds empowering, and who wouldn't want more ownership and more control? But in practice it meant developers were now on call for production issues. There was a time when developers didn't do on-call; ops people did. Now it's so normalized that everyone just assumes that if you're a developer, you're on call. You're managing CI/CD pipelines, writing Terraform config, and if you're running Kubernetes, you're debugging the clusters, all on top of your regular feature work. More scope again, same compensation.
And now we have the AI version of this pattern. Everyone is a staff engineer. Everyone is an architect. You're not just writing code anymore: you're expected to think architecturally, manage context across different systems, steer AI agents to produce the right output, review everything they generate, and make strategic decisions about whether you're building the right thing at all. The scope has expanded massively again, and I think you can guess what's happening to the paycheck.

So let me tell you what I think is happening here. This is not about AI elevating engineers to a higher level of work. That's what it looks like on the surface, and maybe for some individuals it genuinely is. And yes, AI makes people more productive; I'm not going to debate that. But as an industry trend, what's really happening is something I'd call the expectations trap.
Here's how it works. AI tools let a mid-level engineer produce output that looks like it came from a staff engineer or an architect. They can scaffold systems very quickly, maybe even generate architectural diagrams, and produce code that looks nice and spans multiple systems. The output looks impressive. So companies look at that and go: great, now we can get staff-level output from mid-level engineers, which means we don't have to hire as many staff engineers or architects, and we don't need to promote mid-level people to staff, because they're already producing at that level, so why promote them? But if everybody is a staff engineer and everybody is an architect, then nobody is. The title expands to cover more people, but the compensation does not expand with it. That's title inflation.
This is exactly what happened with the senior engineer role. Ten or fifteen years ago, saying you were a senior engineer meant something. It meant you had expertise. You were a step above a junior, and the title denoted that step: you'd been through a bunch of production incidents, you knew the system well, and you could make architectural decisions. You didn't have as much authority as a staff engineer or an architect, but you made architectural decisions within a certain scope and lived with the consequences. You had ownership of something. Now companies hand out senior titles after two or three years of experience. I've seen boot camp graduates with no prior experience show up as senior developers on LinkedIn within a year or two. There's an interesting comment I read some time back about how each of these titles goes through the same inflation cycle. Title inflation works just like currency inflation: when currency inflates, your money is worth less and less and you have to pay more to get the same value; with title inflation, a title has less and less value and you have to do more work to justify the expectations of that title.
And that's exactly what this is: we're watching, in real time, the architect and staff engineer roles go through the same devaluation the senior engineer title went through a while ago, and I think AI is contributing to it and accelerating it. I'm going to link the post that got me thinking about this, the one about how everybody is a staff engineer, in the description. That post makes one point I find very insightful: the engineers who succeed and thrive won't be the ones who are best at prompting AI and getting it to do what they want. They'll be the ones who can best manage context: knowing the code base, understanding the business domain, carrying the full picture of what the system does and why. I completely agree with that, and I've talked about it in a bunch of live streams and a couple of videos. Context is everything, and it was everything even before AI. It's the thing that separates someone who just walked onto the team from someone who has been there, made decisions, and knows how things are laid out.
But the problem nobody's addressing is: where does that context come from? It comes from actually doing the work: implementing, writing code for years, debugging, dealing with production issues. That work is what builds context over months and years. You build judgment by doing something and seeing its consequences, not by handing it off to an AI and having it do the work. So there's a bit of a paradox here. On one hand, you can say writing code was never the hard part; yes, AI can write code, but the strategic thinking and the architectural vision are what's harder to develop. Yet dismissing code as the easy part makes it easy for companies and CTOs to pay less for it. Implementation is where you build context. Implementation is what makes you valuable, and implementation is also what's being devalued, because hey, AI can do the job. You earn the right to make architectural decisions by implementing, because you've lived with the consequences of bad architectural decisions. So if we automate that away for junior developers, if they skip straight to staff-level thinking without years of building things, how do they develop the judgment that makes staff-level thinking valuable in the first place?
There was a recent news item about the AWS CEO putting it very bluntly: eliminating junior developers is one of the dumbest things he's ever heard. I completely agree with that. He even made a video saying that eliminating junior developers is basically companies shooting themselves in the foot. His reasoning was that if you don't have people learning fundamentals today, you won't have anyone who actually understands anything ten years from now. You've optimized for short-term velocity and created a longer-term knowledge crisis. I'd go beyond that: forget the long term, you'll have problems even in the short term without junior developers. But let's set that aside; we know it's a problem.
The other thing about everyone being a staff engineer is the velocity expectation. I'm seeing this at the companies of friends I've talked to, and it's changing every day. AI doesn't change what you're expected to do; it changes how fast you're expected to do it. I've had managers ask, "What's the estimate for this work?" and when you give the estimate, they go, "Really? Just give it to ChatGPT or Claude, it'll get it done in an hour." Think about it: if an AI can generate a working implementation in minutes, something that would take a developer a day or two, what does that do to your sprint planning? To your deadlines? The answer is they compress. If the tool can write your code in seconds, the expectation becomes that you ship in hours what you used to ship in days. It's not that you have more free time; it's that you're expected to do more. And the people setting these expectations, product managers, engineering heads, even executives, are seeing AI-generated demos and before-and-after productivity charts, and they're saying, "If the demo could be built that quickly, why aren't we seeing that productivity improvement in our staff?"
But here's what they're not accounting for: code generation is faster, sure, but the thinking hasn't gotten faster. Understanding requirements, making design trade-offs, reviewing the AI's output, hunting for subtle bugs, integrating the generated code into existing systems. If the AI generates something one-off, you still have to work out how it fits into the existing system and how to handle the edge cases. All of that still takes human time, and in many cases it takes more time now, because you're reviewing code you didn't write and might not fully understand. So yes, the productivity gain is real, but it's smaller than perceived, because there are more things you now have to do. There's a gap there, and that gap is already causing burnout in developers, and I think it will keep getting worse until the expectations actually get aligned.
Okay, so I've told you a bunch of stuff, and this is what's actually happening in the industry right now. So what do you do with it? I don't want to just point at a problem and leave it there. To be clear, this is not an anti-AI take. AI coding tools are genuinely useful, and I use them. I have to say this in every video, because any time I say something critical about AI, people go, "Oh, you don't get it, AI is actually cool." I use AI every day, I build systems with AI, and I have a 9-to-5 job at a software company where I build with AI. The problem isn't the tools; the problem is the narrative around the tools. And narratives can be pushed back on. If you're a mid-level developer being asked to do staff-level work because AI makes it possible, recognize what's happening. It's not that you're being empowered; you're being given more scope because of an implicit assumption that you should be doing more. And as long as you don't meet that assumption sitting in someone else's head, you'll be seen as someone who isn't living up to their potential.
So I think it comes down to communication and setting expectations. A lot of people are in jobs they don't have the liberty to walk out of, because the market is brutal right now. But it still comes down to how you communicate what's happening. If there are gaps your manager isn't seeing, it's up to you to show them. You cannot just absorb the extra pressure, because it's not sustainable: the pressure will keep coming and you'll have to keep delivering. So as much as possible, communicate. Whether it works is a different issue, but you have the opportunity, and a responsibility to yourself, to communicate. You can say, "Yes, you think it takes an hour, but here's what it actually took: this part took an hour, and here are the problems I ran into and had to fix." Point it out. Imagine a teammate delivered code in an hour, you picked it up, and it was a mess you had to clean up. Wouldn't you tell your manager? You'd say, "You thought the teammate fixed it in an hour, but it actually took this long," and you'd hold the other person accountable and explain what you had to do. You have to do the same thing with AI. Treat AI as your teammate: if your manager says, "Well, AI delivered it in an hour," you say, "No, this is what it delivered, and these are the things I had to do."
Also, be skeptical of industry narratives that conveniently benefit employers. "Everyone is a staff engineer, everyone has to think architecturally" is a catchy headline, but it implies something. If someone tells you your role has been elevated, you have to ask: is this just more work, or is it an actually empowering move? If it genuinely is empowering, great, it's good for your career, take it. And finally, think about the long game. The people who will be most valued in five or ten years aren't the ones who are fastest at generating code with AI, because AI is being commoditized; everybody's using it, so you'd be competing with a bunch of others doing the exact same thing. The people who will shine, and who will have prospects in the future, are the ones who genuinely understand systems and have deep context, because context is what differentiates, and you build context through years of actually doing the work, not just directing AI to do it. Sure, use AI tools, but then see where the output fits. Understand what was actually asked of you. When someone gives you a task, don't just hand it to AI: find out why it was given to you, what the AI produced, how it fits into the bigger picture, and whether your AI-assisted contribution actually addressed the need. Learn from it. That's the key, and there are no shortcuts. I'm going to post the link to the article in the description; it's worth a read.
I've been through a bunch of these cycles, and it's happened over and over again. Full-stack engineering was reasonable, DevOps was reasonable, and in both cases we as developers got used to justifying expanding scope without expanding compensation. So the real question isn't whether AI can make you work like a staff engineer; it's whether the industry will pay you for it. If the industry says you're working at staff level not because of your own ability but because of the tools it gave you, then you have to call that out: staff-level thinking is different from junior-level thinking, and if that delta is simply expected of you now, you won't be compensated for it. That's the key.
So let me know in the comments what you think. I'm genuinely curious how this is playing out for you at your company. Are you being asked to do more, being asked to deliver faster, and saying, "Well, yeah, I can take care of it"? Or are your managers and senior leadership a little more practical and actually testing what's really