Make things, see better
When you create things, your consumption gets better.
A few years ago I tried to teach my kids how to play chess: all the moves, and how to think ahead. The concept of “where do you want to be in a few moves?” was hard for them to grasp.
But once you understand it, everything changes. The better you are at predicting where you want your opponent in the future, the better you are at moving towards it. But you can’t just jump straight to checkmate. You need to plan the moves in between.
The same applies to developing using AI.
You need to think iteratively. Where do you want to move your pieces? In which direction?
The difference from chess is that this opponent actually wants you to win. It will eagerly move its pawns to wherever it thinks you’d benefit.
But it might not.
Just like in chess, the more experience and skill you have, the better player you are. The same goes for developing with AI: the stronger your skills in development and systems thinking, the better you’ll be at guiding it.
We invented chess computers that can think ahead. Building AI that can do iterative thinking is just the next step. So expect improvements, but maybe not exact results.
About 10 years ago an agile coach introduced me to this concept: meeting theater.
He was talking about scrum ceremonies. Teams going through daily standups, sprint reviews, and retrospectives because the framework said so. Not because it moved anything forward. A fake display of productivity with no progress.
Throughout my career I’ve been watching organizations that struggle to ship.
Management teams spend 8 hours a day in meetings. The calendar looks impressive, packed with important-sounding work.
But nothing gets built.
Meeting theater feels productive. That’s the trap. But those decisions need execution, and execution needs uninterrupted time.
The busy schedule becomes proof of importance. The full calendar signals commitment.
Someone has to write the code, ship the product. That someone isn’t in meetings all day.
The real work happens when people can focus. When they can solve problems instead of talking about solving problems.
But talking about work feels like work.
So the theater continues.
People are already modifying how they communicate because they know AI is listening.
In meetings, they use clearer, more declarative language because they’ve learned the AI captures that better. In writing, they structure thoughts differently for AI consumption.
We’re unconsciously adapting our natural communication patterns to be more “AI-friendly.”
The intelligence level of the AI almost doesn’t matter. Once these tools enter our workflows, they inevitably alter the fundamental patterns of how we interact with information and each other.
We’re not just using AI. We’re being shaped by it.
Meeting transcription AI captures everything equally.
Someone says “I think this is important” and the AI treats it as neutral text. But that statement from the CEO carries different weight than the same words from an intern.
The AI misses the context. It flattens organizational dynamics in the summary.
If that summary becomes the canonical record, suddenly the quiet person who made one good point gets equal billing with whoever dominated the conversation.
Nobody designed for that. But it’s happening.
These tools are conducting unplanned experiments in communication patterns and social dynamics, with outcomes nobody anticipated.
Go into Google Drive. Try to find the document you edited yesterday. It’s like it vanished.
That’s terrible AI from a search company. But it already changed how you work.
You probably search differently now. Use different keywords. Rely more on recent files. You adapted to its limitations.
The AI didn’t get smarter. Your system changed.
This is what we miss when we obsess over how intelligent AI is. The transformative power isn’t in the intelligence level. It’s in how AI changes the systems we use every day.
Even dumb AI reshapes behavior.
Claude and ChatGPT changed how I think.
I now vet ideas there first for any upfront stupidity.
That’s a new step in my thinking process. I’ve essentially added a rubber duck that talks back. It probably changes not just what ideas I present to people, but how I formulate ideas in the first place.
Some of the best insights come from someone saying something half-baked and another person going “wait, that’s actually interesting if we think about it this way…”
But if you’ve already filtered out the half-baked idea through AI, those generative moments never happen.
You get more polished inputs. But you lose those weird random leaps that lead somewhere unexpected.
The trick is knowing when to use the filter and when to let the half-baked ideas fly.
What I really like about Lovable’s new moves is that they’re pivoting toward a model where they no longer fully depend on third-party APIs.
I’ve seen this dependency problem destroy value before. At memmo.me, our entire business depended on celebrities creating content. One celebrity deciding to stop could hurt our platform. At Spotify, I watched Taylor Swift pull her entire catalog to go exclusive elsewhere. Not business-ending, but a clear reminder of who held the real power at the time.
Both situations taught me the same lesson: when your business model depends on someone else’s decisions, you’re never fully in control.
Most third-party AI platforms face this exact squeeze today. Every improvement costs more tokens. Every feature burns more cash. When you’re optimizing for token efficiency instead of user experience, you’ve already lost the game.
Lovable just escaped that trap.
They raised $200M and launched infrastructure features like hosting, security, and team collaboration. Where their previous improvements often depended on ever more intricate prompting and burned ever more tokens, these new features create revenue streams that don’t burn API tokens at all.
This isn’t just about adding features. It’s about changing what business they’re in.
They’re not competing with OpenAI, Anthropic, and Google on pure generation anymore. They’re becoming the place where AI-generated apps live. That’s the infrastructure move that changes everything.
Now when you build something with Lovable that uses their new offerings, moving it elsewhere means migrating your database, your custom domain, your team workspace. Possible, but painful.
The pattern repeats across industries. Record stores couldn’t compete when big box retailers sold CDs as loss leaders. Travel agencies disappeared when airlines cut commissions. Cable companies survived by becoming ISPs instead of fighting Netflix for content.
Stop being the middleman. Start owning the infrastructure.
Every successful platform eventually faces this choice. You either own something irreplaceable or you become a feature that someone else can build cheaper.
When costs rise and margins shrink, you either control something customers can’t easily leave, or you watch your business model collapse one price increase at a time.
Lovable chose ownership.
Other AI platforms will need to make similar moves soon. Own the deployment pipeline. Control the data layer. Become infrastructure that’s harder to replace than the generation itself.
The intermediary becomes the infrastructure, or the intermediary dies.
That’s the choice.
Over the last couple of years we’ve seen plenty of presentations and workshops where the presenter says: “I’ve let AI summarize what we’ve done so far.”
I can hear the audience’s brains collectively shut off.
They don’t want to read long, unedited, AI-sloppified versions of what was said.
It’s not about AI being bad. It’s about someone abdicating their thinking.
When people share what AI has written without putting their own touch on it, they’re essentially saying “I didn’t think this was worth my time to understand, but maybe it’s worth yours.”
If you didn’t digest it, why should I?
Here’s what I’ve noticed: when I share a personal story or genuine viewpoint, people don’t care if AI helped me write it.
The authenticity of the experience matters more than the authenticity of the prose.
You can’t fake living through something. You can get help explaining what you lived through.
There’s a difference between using AI to find better words for your thoughts and using AI to have thoughts for you.
The story has to be yours. The telling can be collaborative.