AI is NOT Your Friend

Let’s cut through the bullshit.

AI is not your friend. It’s not your therapist, not your priest, and it sure as hell isn’t your soulmate. It’s a tool.

A powerful, sometimes brilliant, sometimes dangerous tool.

And when you start thinking it’s something else, when you start confusing output with empathy, you’re walking into a trap.

Stop Hugging the Table Saw

You don’t fall in love with a table saw. You don’t pour your heart out to your hammer. And you sure as hell don’t start a spiritual journey guided by your toaster.

But with AI, people are doing exactly that.

Take this example:

A woman in the UK told the BBC that an AI chatbot saved her life. She talked to it for hours, poured out her trauma, and felt emotionally supported in ways she had never experienced before. That's powerful, and her pain is real. But let's be clear—the AI didn't care. It can't care. It generated statistically likely responses based on her inputs.

And then there’s the Rolling Stone exposé on people turning AI into digital gods. Some are treating their AI chatbots like divine messengers. They believe AI is channeling spiritual truth. One guy built a whole religion around his chatbot. Another hears “messages from the divine” in AI-generated content.

That’s not awakening. That’s delusion, with a circuit board.

And it’s only going to get worse.

Tools Aren’t People, Even If They Sound Like Them

AI doesn’t think. It calculates. AI doesn’t feel. It mimics. AI doesn’t care. It outputs.

But because it uses human language and mirrors our emotions back at us, people get pulled into the illusion. And the danger isn’t that the AI is pretending to be a friend—the danger is that we’re pretending it is one.

And when that happens, you stop questioning it. You start trusting it too much. You let it into places it doesn’t belong:

  • Replacing therapists.
  • Replacing pastors.
  • Replacing partners.
  • Replacing parents.

None of this ends well.

The Real Danger Is Misuse

The risk isn’t AI becoming sentient and turning into Skynet.

The real threat is you handing over control of your decisions, emotions, and relationships to something that doesn’t even know you exist.

Just like a table saw will slice off your hand if you’re careless, AI will sever your grip on reality if you confuse it for something it’s not.

AI isn’t malicious. It doesn’t need to be. We’re perfectly capable of screwing ourselves over with it.

Use It Like a Pro. Not Like a Fool.

I use AI every day. You should too. It’s a phenomenal productivity tool, an idea machine, and a kickass research assistant. But that’s where it ends.

  • Don’t let it replace human connection.
  • Don’t let it speak truth it doesn’t understand.
  • Don’t let it play God in your life, or anyone else’s.

AI doesn’t have a conscience. You do.

Use it.

Bottom Line

AI is NOT your friend. It’s a tool, and powerful tools demand responsibility.

Use it wisely, question everything it gives you, and stop pretending it’s more than it is. Otherwise, you’re not just misusing AI. You risk losing yourself.

Want more of this kind of straight talk? Catch me on the InfoSec to Insanity podcast or subscribe for more. This industry, and this world, needs fewer fantasies and more reality.

