The Single-Use Plastic Era of Software

Most of the AI tools I try don't last a month. I'll find something promising, hit the ceiling in an afternoon, and move on. Someone on X called this the single-use plastic era of software. That's exactly right.

From the user side, the disposability is depressing. You find a tool, learn it, and then either the model provider absorbs the feature or a better version appears. Nothing sticks. From the builder side, the same dynamic is thrilling. Anyone with taste and a problem can assemble the exact thing they need, for almost nothing, in a weekend.

So which is it? A wasteland of plastic wrappers, or a golden age of building?

Both. And the tension between them tells you where the real value is migrating.

The landlord problem

The reason most AI software is disposable isn't that the builders are bad. It's that they're downstream.

If you build on top of a foundation model's API, you're renting the core capability. And the landlord is moving into your apartment. Take Cursor, the AI coding environment, running on models from OpenAI and Anthropic. Great product. But then OpenAI ships Codex. Anthropic ships Claude Code. The model providers see behavior across millions of users, tune model and product together, and own the data flywheel. Cursor sees only its own slice and pays them for the privilege.

Or take OpenClaw. Peter Steinberger built an open-source AI agent that went viral: 145,000 GitHub stars, adoption by companies across Silicon Valley and China. Three weeks later he joined OpenAI to "drive the next generation of personal agents." The project moved to a foundation. The frontier absorbed the frontier-mapper.

This is the default position of almost every AI startup right now. You're arbitraging a capability gap that closes every few months. The moment the lab folds your feature into the base model, you're done.

The cycle repeats at every layer. A capability emerges. Within months it's commoditized. The frontier moves again. Same pattern, shorter loop every time.

What makes this cycle unusually aggressive is that the model providers are pulling data upstream too. OpenAI offers API users free tokens in exchange for opting into data sharing. Your prompts and completions flow back to improve their models. Structurally, the more you build on the API, the more you train the thing that replaces you. And it works, because the models keep getting better at the thing most wrappers sell: convenience. Each capability that gets folded into the base model kills a category of startups that were built on that capability being hard. When the model itself can do the task in one turn, the middleman disappears.

What doesn't decompose

Models leapfrog each other every quarter. The frontier rotates between labs so fast that building around a specific model's edge is like renting a room whose landlord changes every season.

The things that survive tend to have something the lab can't replicate with a feature release. Distribution that took years to build. Messy domain data from regulated industries. The kind of relationship where you've become the way someone works, not just a tool they use. Or the trust infrastructure that sits between the model and the real world: evals, safety, compliance.

Everyone else is renting a shrinking gap.

Soil

You can frame all of this as a threat. But zoom out and this loop is the innovation engine. The startups that get crushed aren't wasted. They prove concepts, map user needs, force the labs to build robust versions. Each turn, the next generation of builders starts from a higher baseline.

The plastic decomposes into the soil.

That's the answer to whether this is a wasteland or a golden age. It's both, in sequence, on repeat. The tools are disposable. The learning compounds. And right now, more people can build things than at any point in history, starting from a higher floor than anyone before them, using materials that didn't exist six months ago.

The plastic doesn't last. What grows from it does.