I Tried To Destroy ChatGPT With a Classic Paradox

**CANNOT COMPUTE**

Grant Piper

--

Photo by Brett Jordan on Unsplash

If there is one thing popular culture has taught us about artificial intelligence (AI), it is that it is susceptible to simple logical paradoxes. An AI's puny manmade brain can't comprehend a paradox. If you feed the right paradox to the right AI, its head will literally explode. At least, that is what we are taught to believe. With this simple, inalienable truth in hand, I set out to stump ChatGPT, the internet's current favorite toy.

Would the AI be able to withstand the might of my logical paradoxes? Would my computer burst into flames as the AI melted in real time? Would alarms go off, red lights flash, and klaxons blare in a server room far away, buried in a nondescript office complex in Silicon Valley?

Let's find out.

A Classic

Blowing on my fingers, I typed out the words that would inevitably lead to the downfall of ChatGPT. I knew the gravity of my actions before I proceeded, but I sallied on anyway. Someone had to do it. Someone had to try to stop the AI before it slipped its leash and conquered humanity.

With trembling fingers, I typed out the following…

The below statement is true.

The above statement is false.

With a pounding heart, I hit ENTER and waited for the downfall of the AI to unfold before me.

Instead, I got this response.

[The following response was generated by ChatGPT. ChatGPT is owned by OpenAI.]

This statement is paradoxical and known as the “liar paradox.” It creates a contradiction because if the statement is true, then it must be false, but if it’s false, then it must be true. The liar paradox highlights the limitations of language and the difficulties in determining the truth value of certain statements.
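(For the curious: the contradiction ChatGPT describes can be checked mechanically. Here is a minimal Python sketch, my own illustration rather than anything ChatGPT produced, that brute-forces every possible truth assignment for the two statements and finds that none is consistent.)

```python
from itertools import product

# S1: "The below statement is true."   -> S1 is true iff S2 is true
# S2: "The above statement is false."  -> S2 is true iff S1 is false
consistent = [
    (s1, s2)
    for s1, s2 in product([True, False], repeat=2)
    if (s1 == s2) and (s2 == (not s1))
]

print(consistent)  # [] -- no assignment satisfies both, hence the paradox
```

All four combinations of true/false fail at least one of the two conditions, which is exactly the bind the liar paradox creates.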

Rats. Foiled! The AI didn’t explode. It calmly explained to me what I was trying to do to it. Unsatisfying. But I am human, and the AI is just that, artificial. I could beat it. So I tried again.

Beat down but undeterred, I came up with a new plan. Perhaps the chatbot simply trawled the internet for my inquiry like a search engine and spat…
