
The Word "Generate" Is Doing a Lot of Heavy Lifting

In machine learning, "generate" started as something precise. But when the word escaped the lab, those meanings collapsed into each other.

Nilima Garg

Dec 22, 2025

The Signal
TL;DR
  • "Generate" meant something modest in machine learning — sample from a distribution. But in plain English it already meant create from nothing, and those meanings collapsed into each other.
  • AI outputs are always implied by the input. Whether human thinking is fundamentally different or just feels that way from inside our own heads is a question nobody has settled.
  • The word matters because it shapes what we expect from these systems — and what we stop expecting from ourselves.

In machine learning, "generate" started as something precise. A generative model produced data points from probability distributions. Sampling: drawing from the patterns it had already seen and recombining them into something that resembles novelty. The technical term was modest. The model could output, but there was no claim of origination. Just: here's what statistically fits, based on what came before.
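That "sample from a distribution" framing can be made concrete with a toy sketch. This isn't any real model's implementation, just a character-bigram sampler I'm using as an illustration: it learns which characters follow which in its training text, then "generates" by drawing from that learned distribution.

```python
import random

def train_bigrams(text):
    """Learn a conditional distribution: for each character,
    which characters followed it in the training text."""
    counts = {}
    for a, b in zip(text, text[1:]):
        counts.setdefault(a, []).append(b)
    return counts

def generate(counts, start, length, seed=0):
    """'Generate' text by repeatedly sampling what statistically
    fits, given what came before."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nexts = counts.get(out[-1])
        if not nexts:  # no continuation was ever seen in training
            break
        out.append(rng.choice(nexts))  # sample from the distribution
    return "".join(out)

counts = train_bigrams("the cat sat on the mat")
sample = generate(counts, "t", 10)
# Every adjacent pair in `sample` already exists somewhere in the
# training text: the output is, in that sense, implied by the input.
```

The sample can look surprising, even novel, but by construction nothing in it reaches beyond pairs the model has already seen — which is the modest sense of "generate" the original term carried.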

But "generate" already meant something bigger in plain English: to create what didn't exist before. So when the word escaped the lab, somewhere between the research papers and the marketing decks, those meanings collapsed into each other. Now "generative AI" sounds like machines that think, imagine, and create, not machines that predict tokens really well.

I don't think this is just a semantic problem. Or maybe it is and I'm making it into something bigger than it needs to be.

Is It Different Though?

Here's what I keep getting stuck on: when a model produces something, it's looking backward. It's finding what fits based on everything it's been trained on. It can extrapolate, remix, and hallucinate in ways that feel surprising, but there's nothing in the process that reaches beyond what was already there. The output is always, in some sense, implied by the input.

When I think about how I come up with things, it feels different. But I genuinely don't know if it is different, or if it just feels that way because I'm inside my own head and can't see my own process.

Everything I think is shaped by what I've experienced. My brain takes inputs, finds patterns, makes connections. You could argue that's exactly what the model does, just faster and with more data. Maybe the thing I'm calling "origination" is just synthesis I can't trace. Maybe the sense that there's something else happening, something that links experiences in a particular way that isn't just pattern-matching, is an illusion. Something I believe because I can't see my own wiring.

I don't know. To be honest, I want there to be a distinction. I want "synthesis" and "generation" to mean different things, not just technically, but actually. I want the human version to be different, not just feel different because one happens in my head. But wanting that doesn't make it true, and I can't figure out how I'd know either way from inside my own experience.

So, I guess, I'm just asking. Do you feel this too? That sense that there's a boundary between recombination and origination? Or do you think we just want there to be one?


This post originally appeared on Nilima's Substack, Still Thinking Here. Subscribe if you want more like this.

Nilima Garg

Co-Founder, Introve
