Something quietly grotesque is currently happening inside the world’s largest AI labs.
Engineers at OpenAI, Meta, Anthropic, Microsoft, Amazon and Google are treating tokens—the fundamental units of AI computation—the way social-media creators once treated “angry” reactions. They are trying to max them. Deliberately. Publicly. On internal leaderboards that now sit alongside OKRs.
One engineer burned 210 billion tokens in a single week. Another racked up a $150k monthly bill on Claude Code alone. Jensen Huang suggested tokens might soon become “the fourth pillar of compensation.” Companies started handing out six-figure token budgets the way they once handed out RSUs.
On the surface, this looks like the ultimate productivity flex. More tokens = more agents, longer contexts, faster iteration, god-mode leverage. Big Tech can afford it; they own the models, the GPUs, or the volume discounts. The rest of us watch from the cheap seats and assume the giants have simply pulled further ahead.
AI capex spending is creating a growing gap between the haves and have-nots in AI.
The Proxy That Ate the Mission
Remember Meta’s engagement machine?
Facebook didn’t set out to polarize society. It set out to measure something reasonable—time spent, shares, reactions—and then optimized for it with religious intensity. “Angry” reactions were weighted five times higher than likes because they predicted longer sessions. The algorithm responded exactly as designed: it flooded feeds with outrage, conspiracy, and emotional napalm.
Creators learned the new game within weeks. The platform became objectively more “engaging” and subjectively more toxic. Revenue climbed; trust collapsed; regulators circled. The metric won. The product—and the culture—lost.
Token maxing is the same game, just with different tokens (pun intended).
The stated goal is “productivity.” The measured input is tokens consumed. The moment those two things are linked to status, reviews, and compensation, Goodhart’s Law does what it always does: the proxy stops being a proxy and becomes the target.
Engineers stop asking “What is the smallest, highest-signal prompt that solves the customer problem?” They start asking “How do I burn more tokens today?” The result is the AI-era version of the old “lines of code” (LOC) metric that once produced mountains of unmaintainable spaghetti: bloated contexts, unchecked agent swarms, reams of unreviewed hallucinated code, and a creeping atrophy of human judgment.

Second-order effects appear quickly. Codebases swell with technical debt that no one has time to audit because everyone is busy prompting. Subtle bugs multiply in the noise.
GitHub is already seeing this in the massive quarterly growth of its code repositories.
Third-order effects are slower but more lethal: entire engineering cultures begin to equate volume with value. Talent markets shift toward people who are world-class at spending AI rather than steering it. Headcount math flips from “how many engineers do we need?” to “why not cut heads and just buy more tokens?”—exactly the automation-over-humans outcome the labs claim they’re trying to manage responsibly.
Are the giants building a moat based on token access for coders? Yes. But perhaps they are also building a sinkhole and calling it progress.
The Startup Paradox
So what does all this mean for founders and solo builders trying to compete? Or for established companies without massive AI token budgets?
Big Tech’s seemingly insurmountable advantage—unlimited token budgets, agent swarms running 24/7, entire codebases stuffed into 1-million-token contexts—may be exactly what blunts their edge fastest. They are training their best people to be inefficient.
They are creating a generation of engineers who have never had to ship under constraint. They are optimizing for the appearance of velocity while baking in fragility.
Meanwhile, the resource-constrained—startups, indies, anyone who cannot afford $100k token weeks—are forced to do the opposite. They must become surgically precise. They learn token-efficient prompting, minimal viable agents, hybrid human-AI loops that actually compound. Scarcity breeds signal.

We don’t know who will win in the long run. Will it be the teams that burn the most compute and generate the most output or the teams that extract the most value per token?
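One way to make "value per token" concrete is a toy metric. Everything below is hypothetical—the function name, the units of "customer value shipped," and the figures are invented purely to illustrate how a burn-heavy team and a token-disciplined team can rank very differently once you normalize by spend:

```python
# Hypothetical sketch: ranking teams by value-per-token instead of raw
# token burn. All names and numbers are illustrative, not real data.

def value_per_token(customer_value_shipped: float, tokens_spent: int) -> float:
    """Value delivered per million tokens spent; higher is better."""
    if tokens_spent <= 0:
        raise ValueError("tokens_spent must be positive")
    return customer_value_shipped / (tokens_spent / 1_000_000)

# A token-maxing team vs. a lean one (made-up figures):
big_burner = value_per_token(customer_value_shipped=50.0,
                             tokens_spent=210_000_000_000)  # 210B tokens
lean_team = value_per_token(customer_value_shipped=40.0,
                            tokens_spent=2_000_000_000)     # 2B tokens

assert lean_team > big_burner  # scarcity breeds signal
```

The point of the sketch is only the denominator: once spend appears below the line, the leaderboard inverts.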
History is littered with examples of incumbents who mistook their own waste for strength: newspapers that optimized for pageviews instead of reader trust, platforms that optimized for engagement instead of meaning, retailers that optimized for store traffic instead of profitable customers. In every case the correction was brutal and came from the outside.
The Obvious-in-Hindsight Danger
Token maxing is not a productivity story. It is an incentive-design story. And the incentive being designed right now is the same one that turned social media into a public-health crisis and software engineering into a lines-of-code arms race.
By publicly celebrating and compensating raw token consumption, Big Tech is proving—out loud and in real time—that AI leverage is already commoditizing. The moment the metric becomes the goal, the game shifts from building defensible products to performing defensible theater. The giants will look unstoppable for 12–24 months. Their output numbers will be insane. Their recruiting decks will be irresistible.
Then the technical debt compounds, the model providers raise prices or throttle, the culture of quantity collides with the reality of quality, and the correction arrives. When it does, the lean, token-disciplined players who never drank the Kool-Aid will be the ones with clean codebases, clear product sense, and genuine speed.

The strategic implication is brutally simple.
If you are at a big tech lab: stop measuring tokens. Measure outcomes that cannot be gamed—customer value shipped, defects introduced, architectural integrity preserved, time to insight on hard problems. Kill the leaderboards before they kill you.
If you are a startup or solo founder: do not envy the token budgets. Your constraint is your unfair advantage. Build the muscle of doing more with less now, while the giants are still high on their own supply. The AI era will not be won by who can spend the most. It will be won by who can waste the least.
The trap is already set. Most people inside it still think it’s a throne.