
Technical (adj.): Finally Got With the Times

What do people really mean when they say someone isn't technical?

Nilima Garg

Jan 11, 2026

The Signal
TL;DR
  • "Technical" has been compressed to mean "writes code" or "uses AI tools" — but subject matter expertise, domain thinking, and tool proficiency are three different things getting collapsed into one.
  • Coding courses and AI tools don't make someone more technical. They give people a way to execute on expertise they already have.
  • LLMs run on language. Telling language-skilled people they need to become more like programmers misses that their understanding of meaning and context is exactly what differentiates good AI implementation.

After my layoff from Spotify, I found myself doomscrolling through LinkedIn pretty much daily. And woof, that place is… something. I know I just wrote about the "using AI for content" thing, but something else has been bothering me since long before people started posting "Here's what getting hit by a bus taught me about B2B sales" or whatever. It's the word "technical" - it feels like it's gotten compressed to mean "writes code" or "uses AI tools". To sound exactly like a 2010s college application essay: Merriam-Webster defines technical as "marked by or characteristic of specialization" or "of or relating to a particular subject." Sometimes I even forget that technology spans everything from the wheel to supercomputers. Words have meaning. I know meanings shift over time, but this one still feels like it's supposed to mean what the dictionary says.

Three Things Getting Collapsed

I think there are three things getting collapsed:

Subject matter expertise: an IP lawyer knows different case law than a criminal defense lawyer. Different topics, regulations, precedents, etc.

Domain expertise: the way lawyers think. How they structure arguments, anticipate counterarguments, read specific language and see implications. This is the stuff you couldn't just walk into and do. I'm not a lawyer so I don't actually know the full scope of it, but those seem like skills that require training and experience and aren't just… knowing facts.

Tool proficiency: using Westlaw, legal research databases, or now even some AI tools for drafting briefs.

When people say "technical" now, it feels like they mostly mean tool proficiency. Specifically, tool proficiency with computers. More specifically, using AI tools.

The Thing That Bothers Me

I saw a post today, from a while ago, that included a screenshot of a LinkedIn post by Andrew Ng, founder of DeepLearning.AI. Andrew, someone who knows infinitely more than me, wrote: "Among people in non-technical roles (recruiter, marketer, sales...) I notice the more technical ones being more effective, and the gap is increasing. E.g., the ones that took a coding course are outperforming the ones that didn't.

One obvious theory is that they are better at using AI..."

I'm not saying he's got the wrong take. In fact, I think it aligns with the larger point I'm trying to get at. My beef is more with how we use the word "technical" now. He, like many (honestly including me), is calling roles like recruiters, marketers, and salespeople "non-technical."

But is analyzing market trends not a technical skill? Is evaluating candidates, interpreting whether they're actually knowledgeable, whether they'd be a fit for a role, not technical? What is technical, then, just writing JavaScript?

The people in these roles already have cognitive skills built from their experience. A coding course or an AI tool doesn't make someone more "technical." It gives them a way to execute on their existing expertise rapidly and at scale. Using AI tools is just an implementation. It's not where the skill lives.

The Technology Itself Runs on Language

I think this feels meaningful to me because LLMs are Large Language Models. They do sophisticated pattern matching over language (I'm not an AI researcher, this is just my understanding). The choices about what data gets used, how it's weighted, how you evaluate whether output is good; those are decisions where understanding language, meaning, and context would matter.

The research labs building foundation models seem to agree: they have linguists and cognitive scientists on staff. But the companies implementing AI into their products are saying "we need people who can talk to computers," hiring for "technical" AI skills, when what might actually differentiate their implementation is having more people who understand how humans communicate, what context matters, and how meaning works.

It feels like they are telling people trained in domains that require deep language understanding, "you need to become more technical" (meaning: function more like a programmer), when the technology they want to integrate itself runs on language.

The Pattern

That is the pattern that bothers me most. I constantly see messaging that people need to learn coding, they NEED to learn how to use AI tools, they need to "get with the times." And if we tell everyone they need to function "the way everyone else does," we end up with everyone thinking the same way. Innovation stalls, not in some hyperbolic "we're destroying everything" sense, just by adding a ton of unnecessary hurdles. I get it, it's easier to say "if you do X you will succeed" than to acknowledge that reality isn't so black and white. That different ways of thinking aren't deficits, and that what actually makes a difference is diversity of thought.

I don't have a neat conclusion here. I'm just noticing that every time there's a new technology, it feels like we collectively decide there's one right way to engage with it. "You're behind, go learn this thing" rather than "you have skills, here's a tool that might help you apply them."

I don't think learning new tools is wrong, in fact, I'm a huge proponent of expanding outside your comfort zone. But the framing matters. "You're deficient, you must use AI" is different from "you already know how to think, and now there's a new way to apply and expand that."

I wonder what we're missing by not seeing it that way.


This post originally appeared on Nilima's Substack, Still Thinking Here. Subscribe if you want more like this.

Nilima Garg
Co-Founder, Introve
