When AI flattens the room

Meeting transcription AI captures everything equally.

Someone says “I think this is important” and the AI treats it as neutral text. But that statement from the CEO carries different weight than the same words from an intern.

The AI misses the context. It flattens organizational dynamics in the summary.

If that summary becomes the canonical record, suddenly the quiet person who made one good point gets equal billing with whoever dominated the conversation.

Nobody designed for that. But it’s happening.

These tools are conducting unplanned experiments in communication patterns and social dynamics, with outcomes nobody anticipated.

Even bad AI changes systems

Go into Google Drive. Try to find the document you edited yesterday. It’s like it vanished.

That’s terrible AI from a search company. But it already changed how you work.

You probably search differently now. Use different keywords. Rely more on recent files. You adapted to its limitations.

The AI didn’t get smarter. Your system changed.

This is what we miss when we obsess over how intelligent AI is. The transformative power isn’t in the intelligence level. It’s in how AI changes the systems we use every day.

Even dumb AI reshapes behavior.

The stupidity filter

Claude and ChatGPT changed how I think.

I now run ideas past them first to screen out any upfront stupidity.

That’s a new step in my thinking process. I’ve essentially added a rubber duck that talks back. It probably changes not just what ideas I present to people, but how I formulate ideas in the first place.

Some of the best insights come from someone saying something half-baked and another person going “wait, that’s actually interesting if we think about it this way…”

But if you’ve already filtered out the half-baked idea through AI, those generative moments never happen.

You get more polished inputs. But you lose those weird random leaps that lead somewhere unexpected.

The trick is knowing when to use the filter and when to let the half-baked ideas fly.

Escape the third-party API trap

What I really like about Lovable’s new moves is that they’re pivoting toward a model where they no longer fully depend on third-party APIs.

I’ve seen this dependency problem destroy value before. At memmo.me, our entire business depended on celebrities creating content. One celebrity deciding to stop could hurt our platform. At Spotify, I watched Taylor Swift pull her entire catalog to go exclusive elsewhere. Not business-ending, but a clear reminder of who held the real power at the time.

Both situations taught me the same lesson: when your business model depends on someone else’s decisions, you’re never fully in control.

Most AI platforms built on third-party APIs face this exact squeeze today. Every improvement costs more tokens. Every feature burns more cash. When you’re optimizing for token efficiency instead of user experience, you’ve already lost the game.
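
To make the squeeze concrete with invented numbers: if a user pays a flat $25 a month and every generation costs you $0.05 in API tokens, a power user who runs 500 generations in a month has already erased your revenue, before you’ve paid for anything else. And better output usually means longer prompts and more retries, so your best features become your most expensive ones.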

Lovable just escaped that trap.

They raised $200M and launched infrastructure features like hosting, security, and team collaboration. Their previous improvements often depended on ever more intricate prompting that burned ever more tokens; these new features create revenue streams that don’t burn API tokens at all.

This isn’t just about adding features. It’s about changing what business they’re in.

They’re not competing with OpenAI, Anthropic, and Google on pure generation anymore. They’re becoming the place where AI-generated apps live. That’s the infrastructure move that changes everything.

Now when you build something with Lovable that uses their new offerings, moving it elsewhere means migrating your database, your custom domain, your team workspace. Possible, but painful.

The pattern repeats across industries. Record stores couldn’t compete when big-box retailers sold CDs as loss leaders. Travel agencies disappeared when airlines cut commissions. Cable companies survived by becoming ISPs instead of fighting Netflix for content.

Stop being the middleman. Start owning the infrastructure.

Every successful platform eventually faces this choice. You either own something irreplaceable or you become a feature that someone else can build cheaper.

When costs rise and margins shrink, you either control something customers can’t easily leave, or you watch your business model collapse one price increase at a time.

Lovable chose ownership.

Other AI platforms will need to make similar moves soon. Own the deployment pipeline. Control the data layer. Become infrastructure that’s harder to replace than the generation itself.

The intermediary becomes the infrastructure, or the intermediary dies.

That’s the choice.

When people share unfiltered AI, my brain switches off

Over the last couple of years, I’ve seen plenty of presentations and workshops where the presenter says: “I’ve let AI summarize what we’ve done so far.”

I can hear the audience’s brains collectively shut off.

They don’t want to read long, unedited, AI-sloppified versions of what has already been said.

It’s not about AI being bad. It’s about someone abdicating their thinking.

When people share what AI has written without putting their own touch on it, they’re essentially saying “I didn’t think this was worth my time to understand, but maybe it’s worth yours.”

If you didn’t digest it, why should I?

Personal stories trump tools

Here’s what I’ve noticed: when I share a personal story or genuine viewpoint, people don’t care if AI helped me write it.

The authenticity of the experience matters more than the authenticity of the prose.

You can’t fake living through something. You can get help explaining what you lived through.

There’s a difference between using AI to find better words for your thoughts and using AI to have thoughts for you.

The story has to be yours. The telling can be collaborative.

Stand behind everything you publish

I rarely share what AI writes directly.

It will always be my responsibility to stand behind the text, code, or image.

AI can help me articulate ideas. It cannot give me ideas I don’t actually have.

The tool doesn’t matter. The accountability does.

If you can’t defend it, don’t publish it.

I’m writing about AI while using AI to write about it

Meta? Sure.

But here’s what just happened: I had a conversation with Claude about my writing process. We found three post ideas. Claude helped me draft them. I edited them down.

Now I’m writing about that process while doing that process.

The recursion doesn’t make it less real. It makes it more honest.

This is how collaboration works now.

AI is making me a better editor

Everyone worries AI will make us lazy writers.

The opposite is happening to me.

AI wants to write everything. Long paragraphs. Perfect transitions. Neat conclusions that wrap everything up with a bow.

I spend most of my time cutting. Removing words I don’t like. Cleaning up endings that over-explain the point.

The creative work has shifted. It’s not about generating anymore, it’s about recognizing what matters and ruthlessly eliminating what doesn’t.

I’m forced to practice saying “no” to perfectly decent sentences because they’re not essential.

I’m learning how to distill without losing the core.

How to be brief without being empty.

Small steps with AI

As a junior developer, I wanted to rewrite everything. Big commits, massive refactors, complete overhauls. It felt productive.

It wasn’t.

Senior developers know better. Small commits. Testable changes. Break it down further.

As a CTO, I coach the same thing. What’s the smallest thing we can release? Can we make a PoC first? How do we break this down?

Now I’m watching people learn this same lesson with AI.

AI wants to write your entire blog post, refactor your whole function, solve your complete problem. Just like junior me’s big rewrites, it feels productive.

It isn’t.

The people who get the most out of AI treat its output like code. Small changes they can verify. Iterative improvements they can stand behind.
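
In practice, “small changes you can verify” can be as simple as gating every AI-suggested edit behind a test you already trust. Here’s a minimal sketch in Python; the slugify function and its test are hypothetical stand-ins, not from any real project:

```python
import re

def slugify(title: str) -> str:
    """Turn a post title into a URL slug (the one unit the AI may touch)."""
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse runs of non-alphanumerics
    return slug.strip("-")

def test_slugify() -> None:
    # The gate: rerun after every AI-suggested edit; commit only if it passes.
    assert slugify("  Small Steps, with AI!  ") == "small-steps-with-ai"

if __name__ == "__main__":
    test_slugify()
    print("ok: small change verified, safe to commit")
```

One function, one test, one commit. Let the AI be as ambitious as it wants inside that loop.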

They’re learning in months what took me years: the more powerful your tool, the more restraint you need to use it well.

Turns out incremental thinking isn’t just good engineering. It’s good everything.