Billy, Be HONEST before AI
Two days ago, I finished a three-day hike along the coastal mountains with two mentor friends—both objectively “successful” by the world’s scoreboard. One owns a public company. The other is an SVP at a Fortune 500.
We covered almost 50 kilometers and 4,000+ meters of elevation gain. Long enough that conversation stops being “catch-up” and becomes something else—cleaner, sharper, less performative.
Near the top of a peak, fog rolled over the ridge, spilling from one side of the mountain to the other like a slow tide. One mentor stared into the distance and said something that didn’t feel like an “AI opinion.” It felt like a rule of reality:
“I like hiking because the trail is honest.
The trail won’t be morally kidnapped.
It won’t treat you differently because you’re old or young.
It won’t give you extra scenery because you’re tired or your knee hurts.
One step—one view.
Extreme honesty gives you extreme experience.
And honesty is the most important thing humans need when facing AI.”
That sentence landed harder than it should have, because it’s not poetic. It’s operational.
AI is a new trail. And it doesn’t care who you are.
The arrogant question vs. the honest one
When people ask, “Will AI have consciousness?” it sounds deep.
But it’s usually a status move.
It assumes:
humans are the default,
humans are special by nature,
AI must earn a seat at the table by becoming “more like us.”
My mentor’s next line flipped the whole posture:
“We ask whether AI will have consciousness.
But the honest question is whether humans have consciousness.”
That isn’t philosophy. It’s an accusation.
Humans are trained on data too (we just hate that framing)
Here’s the line that made the foggy mountaintop feel like a classroom:
“Our parents’ words, the books we read, the people we loved and hated, the things we experienced, the scenes we saw—those are our training data.
Every sentence we speak, every melody we hum, is built from the past.”
It’s uncomfortable because it removes our favorite illusion: that we are original by default.
A lot of what we call “my personality” is inherited patterns.
A lot of what we call “my opinion” is absorbed language.
A lot of what we call “my taste” is social training plus repetition.
From that angle, the gap between humans and AI shrinks fast.
Not to say humans are machines.
But to say: humans are less magical than we pretend.
And if you can’t be honest about that, you’ll never work well with AI—because you’ll keep using it to protect your ego.
The difference that matters: you can choose your inputs
If humans and AI both output what they’ve been fed, then the leverage point is obvious:
What are you feeding yourself?
And what are you repeating until it becomes you?
AI makes this painfully measurable.
You can watch your prompts produce garbage when your thinking is vague.
You can watch your prompts produce clarity when your constraints are sharp.
You can watch yourself try to “win” against the tool instead of using it.
The trail gives you one view per step.
AI gives you one output per prompt.
If you want better outputs, stop bargaining and start training.
That’s the quiet gift here: AI is not just a tool. It’s a mirror that punishes self-deception.
I have been reading tons of stuff that teaches me how to use AI.
"Be honest before AI" is the most inspiring lesson so far.
Cheers,
Billy




What are you reading regarding AI?