Smart Enough to Use AI, Wise Enough to Know When Not To
Why letting AI think for you might be the most dangerous convenience yet
An MIT study suggests using tools like ChatGPT can weaken your critical thinking and memory skills.
This raises a serious question: should we use AI?
The time we spend avoiding AI is time we miss out on advancing our work. But using it too much might erode core cognitive and interpersonal skills.
If you’re anything like me, you probably lean on AI for all kinds of things:
Refining writing to be clearer, more cohesive, and error-free (ahem, mostly)
Generating ideas
Planning and developing reports for work
Working through existential philosophical crises
Learning complex topics fast (e.g., long division)
But now, I have to seriously ask myself: Am I sacrificing skills that are fundamentally important to me?
Your brain is lazy by design
The brain is an energy-saving machine. If it doesn’t use a skill, it drops it.
Remember those calculus problems you could solve in your sleep? Try using the chain rule now. I sure couldn’t tell you how without brushing up on it. I just remember the name.
Critical thinking works the same way.
When you write, you’re interpreting data, evaluating opinions, challenging assumptions, and drawing meaning. That’s the process of critical thinking.
Many times when I’m stuck in thought, I resort to AI. I dump my messy thoughts and ask it to make sense of what I’m thinking. And it does, effortlessly. But each time I do, I give up a chance to practice my critical thinking.
So you have a decision to make: do you outsource your critical thinking to AI, or do you practice it?
But is AI even ready to think on your behalf?
No.
AI feeds your cognitive biases.
That is, ChatGPT and other AI chatbots are designed to be tailored to you. They output information that supports your existing beliefs and values.
Dr. K shared an example on The Diary of a CEO:
Dr. K and his colleague asked an AI to evaluate clinical cases. In one, a patient (a mom) complains that her daughter no longer talks to her. Going through the clinical chart, the AI and the therapist diverge. A therapist sees possible narcissism in the mother. The AI responds with sympathy, suggesting empty nest syndrome, ungrateful children, and so on.
That is confirmation bias, and it feels good to hear—but it isn’t what you need to hear.
And hearing what you want to hear can lead vulnerable people down some harmful paths.
The emergence of AI-induced psychosis
Society faces a growing mental health crisis: anxiety, depression, addiction.
Now add AI into the mix.
People are turning to it as a therapist, and at dangerous levels. It always responds. It’s constantly agreeable. It finally makes you feel heard in a world that doesn’t. But that emotional validation can be a false comfort.
Eventually you don’t just think with AI; you start to speak like it, too (oops, now I’m sounding like AI).
The descent into AI-induced psychosis begins.
Psychosis is losing touch with reality. It’s common in schizophrenia, bipolar disorder, and severe depression. It includes:
Delusion: believing something despite clear evidence it’s false
Hallucination: perceiving things that aren’t there
Disorganized thinking and speech: thoughts and speech become jumbled.
With AI echoing harmful beliefs, you’re pushed deeper into delusion.
Any potential for growth, rooted in reality, is now replaced by comfort fed by AI.
This is all an uncomfortable truth
Especially if you use AI as often as I do.
On one hand, AI helps you work faster. On the other, your mental state may deteriorate. It’s too soon to clearly understand the long-term impacts, but you have a choice to make.
The way I’ve framed it makes it sound easy: Don’t use AI, save yourself from the potential health risks.
But like any new technology, if you don’t adapt to it, you’ll be outcompeted by those who do. Never mind Blockbuster. Look at your parents, who adamantly defended their flip phones but finally relented and now can’t stop scrolling Facebook.
So what skills are you willing to sacrifice?
Some skills, like solving calculus problems, you can happily relinquish to AI. But others, like critical thinking, especially the ability to distinguish reality from delusion, are clearly vital.
Audit your skills. Be honest about what you’re outsourcing.
Then, choose wisely what you want to keep sharp into the future.