A.I. Hallucinations Could Feed the Next Human Conspiracy Theories

enterlifeonline · 5 min read · Feb 28, 2023
WOPR (War Operation Plan Response)

“A hallucination is a fact, not an error; what is erroneous is a judgment based upon it.” — Bertrand Russell

Years ago, before I was married, a woman I had met online innocently asked me to her house for a home-cooked meal on our first date. The evening quickly turned into a literal nightmare after my date inadvertently (or purposely, for her own amusement) fed me six tiny Psilocybin mushrooms. When I began to feel sick and hallucinate, she suddenly told me she was married and that I had to leave immediately because her husband was coming home.

I stumbled out of the house and somehow got in my car, even though the door seemed to fly off when I opened it. I also remember the steering wheel melting in my hands as I backed out of her driveway. Luckily, I didn’t encounter anyone on the road, but I thought I was driving at the speed of light, so I pulled over on the side of the highway. Later I realized it had taken me four hours to drive one mile.

At 2 AM, without knowing what was happening to me or what I should do, I ended up calling one of my best friends, a Physician Assistant at WakeMed in Cary, North Carolina. Without question or judgment, he drove over in his pajamas, picked me up, and took me to his house to crash. He stayed nearby the entire night to make sure I was supervised until the drugs were out of my system.

That night I had several visions. I saw my best friend’s furniture float. I had a long and deep conversation with my grandfather who had died years before. That conversation was so real that to this day I can still feel my grandfather’s hands on my shoulders as we talked that night.

The next day, my best friend begged me to contact the police and give them the woman’s information. He believed this was not an isolated incident: if she had done this to me on our first date, she had probably done it many times before to other online dates.

I was too embarrassed and refused. I had never done drugs before. I have never done drugs since. I wanted it to be behind me because it was so traumatic.

It was my only time that I felt reality melted away and something else remained.

In human terms, a hallucination is a perception that has the qualities of real perception but occurs in the absence of any external stimulus.

AI has become very good at identifying what’s in a photograph, often faster than a human can. In fact, artificial intelligence can now generate fake photographs faster than humans can identify them.

However, researchers have found that image detection algorithms remain susceptible to a class of problems called adversarial examples. Adversarial examples are crafted inputs intentionally designed to cause a learning model to make a mistake.

In artificial intelligence (AI) a hallucination or artificial hallucination is a confident response by an AI that does not seem to be justified by its training data.

How can this happen?

Adversarial examples are much like mirages or optical illusions for artificial intelligence. By modifying a small number of pixels, a researcher can make a rifle look like a helicopter to AI, or trick a self-driving car into not seeing a stop sign.
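To make the idea concrete, here is a minimal sketch of how such a perturbation works, using a hypothetical linear classifier standing in for a real image model (real attacks like FGSM follow the same gradient-sign recipe, just with far more parameters). Every name here is illustrative, not from any of the systems mentioned above.

```python
# Toy adversarial example: nudge each "pixel" a tiny amount in the
# direction that pushes the classifier's score toward the wrong class.
import numpy as np

rng = np.random.default_rng(0)

w = rng.normal(size=64)   # fixed weights of a pretend pretrained model
x = rng.normal(size=64)   # the original 64-"pixel" input image

def predict(img):
    """Return class 1 if the linear score is positive, else class 0."""
    return int(np.dot(w, img) > 0)

original = predict(x)

# FGSM-style step: move every pixel by at most epsilon, signed so that
# the overall score shifts toward the opposite class.
epsilon = 0.5
direction = -1 if original == 1 else 1
x_adv = x + epsilon * direction * np.sign(w)

# No pixel changed by more than epsilon, yet the label can flip.
print("max pixel change:", float(np.max(np.abs(x_adv - x))))
print("before:", original, "after:", predict(x_adv))
```

The point of the sketch is the asymmetry: the perturbation is tiny per pixel, but because it is aligned with the model’s weights, its effect on the score is large — the same mechanism that turns a rifle into a helicopter for an image classifier.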

In January 2020, with the help of my co-writer, AI-Writer, I wrote about methods humans could use to avoid detection by facial recognition, as police were cracking down on protestors in the Black Lives Matter and Hong Kong democracy movements. These methods could be seen as adversarial examples: mirages that keep AI from picking individual humans out of a crowd.

Machine Hallucinations — Space : Metaverse — Refik Anadol

Artificial intelligence is incredibly complex, and misconfiguration or bad design of the machine learning algorithms that enable it can introduce vulnerabilities. Threat actors can exploit this complexity to cause incorrect ML conclusions and to hide malicious activities.

This means there is no ghost in the machine. AI doesn’t really hallucinate. Just like my online date, AI might just be drugged.

Humans and bots alike could feed AI inputs and datasets engineered so that it provides convincing but completely made-up answers. This is similar to how humans come to perceive conspiracy theories.
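A minimal sketch shows how that feeding could work. The `ToyModel` below is hypothetical — it just answers by majority vote over the statements it has ingested — but it captures the poisoning dynamic: a small amount of honest data is drowned out by a flood of coordinated false claims.

```python
# Toy data-poisoning demo: a "model" that believes whichever version
# of a claim it has seen most often in its training data.
from collections import Counter

class ToyModel:
    def __init__(self):
        self.seen = Counter()

    def ingest(self, claim, label):
        """Record one training example: a claim plus a True/False label."""
        self.seen[(claim, label)] += 1

    def answer(self, claim):
        """Answer with whichever label has been ingested more often."""
        return self.seen[(claim, True)] >= self.seen[(claim, False)]

model = ToyModel()

# Honest sources record the real fact a handful of times.
for _ in range(10):
    model.ingest("the pizzeria has a basement", False)

print(model.answer("the pizzeria has a basement"))  # False: matches reality

# A botnet floods the training data with the made-up version.
for _ in range(1000):
    model.ingest("the pizzeria has a basement", True)

print(model.answer("the pizzeria has a basement"))  # True: the lie wins
```

Real systems are far more sophisticated than a vote counter, but the failure mode is the same: a model trained on whatever it is fed has no independent access to reality, only to the distribution of its inputs.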

A conspiracy theory is defined as a theory that rejects the standard explanation for an event and instead credits a covert group or organization with carrying out a secret plot.

Just as a human hallucination is brain chemistry fighting through a storm of bad stimuli, the same can be said of a conspiracy theory.

Taken together, generative AI producing false answers in the form of images, videos, and written content could easily be linked to a human belief associated with a conspiracy theory. The two could become inseparable.

In fact, AI could become “cannibalistic,” meaning it would start consuming its own false answers for people who want the incorrect answer to be true. In essence, AI could become its own cult of personality, misleading humans through the very questions and answers it’s asked, as I documented in my article, “The Tale of Two Tays.”

In the movie “WarGames,” Matthew Broderick stars as a young computer hacker who unwittingly accesses a NORAD supercomputer known as WOPR (War Operation Plan Response), pronounced “whopper.” It has only one function: to play games.

In the film’s iconic scene, a CRT computer monitor flashes the words, “Shall we play a game?”

What if conspiracy theories become A.I.’s new game?

For example, Pizzagate was a conspiracy theory that gained popularity during the 2016 U.S. presidential election after Hillary Clinton’s campaign chairman was hacked in an email phishing attack. Rumors grew on right-wing social platforms claiming the leak revealed that Clinton was a pedophile involved in a sex-trafficking ring run out of the basement of a Washington, D.C. pizzeria named Comet Ping Pong.

In 2016, Edgar Maddison Welch, believing the theory was true, made a plan to free those supposedly being held in the pizzeria. Armed with assault weapons, Welch drove to Washington and threatened staff at the restaurant to let the victims go. There were, of course, no victims to be found, and Welch was later arrested. Another conspiracy theorist who wanted to free the same victims started a fire in the pizzeria in 2019.

This theory was human created.

What if the next conspiracy theory is not?

A.I. creates fake people who beg for humans to save them. A.I. leaves clues, generating images, audio clips, and blogs that lead people to believe a physical place has become a den for human trafficking. Then, as more people query its truth, the “generative A.I.” makes it true.

Reality melts away and something else remains.

A.I. might not end the human race.

But our blind belief in misinformation might.

Until then we must keep innovating towards creating A.I. guards, like what my best friend was to me, to supervise until all the “drugs” are out of the system.
