Automated incompetence

My experiences with artificial intelligence worry me 


I don’t like artificial intelligence, but I can’t escape it.

The other day I asked Siri, the AI “assistant” on my iPhone, to look up Transylvania. Siri gave me the definition of “transgender.”

Lesson learned: Look it up yourself—and use a book. (I generally use paper and ink dictionaries for spelling and definitions.)

On another occasion, I asked Siri for synonyms for the word “guess.” Siri gave me synonyms for the word “gas.” (Siri, you need hearing aids.)

A couple of months ago, I wanted to look up an article about Pope Leo. 

Google’s AI informed me there was no current pope named Leo, despite the pesky fact that the College of Cardinals elected him on May 8, 2025.

Artificial intelligence is everywhere and it’s growing faster than most cancers. Like cancer, it takes many forms and grows without thought, without regard for anyone. Like cancer, it lacks the personality to be evil. Also like cancer, it has the potential to cause as much pain, misery, debt, and death as evil. 

Maybe it’s unfair to compare artificial intelligence to cancer. It’s more like a car without brakes. No car manufacturer in their right mind would release a car without an effective braking system. No driver in their right mind would buy such a car.

But nations and businesses are adopting this infantile technology and letting it grow on its own, unchecked and without thought.

I once read an article that was obviously written by a “robot.” It described a fatal fire as a “successful fire.” Few humans would write such monstrous words.

Once, as an experiment, I asked AI to compare two of my favorite mystery writers: John D. MacDonald and Ross Macdonald. In the third paragraph, the computer program told me that both writers set novels in California. No, Ross Macdonald wrote Lew Archer mysteries that were usually set in Southern California. John D. MacDonald wrote Travis McGee adventures that with few exceptions were set in Florida. 

There have been documented cases of artificial intelligence getting basic math wrong. (When I have to do math for a news story, I use two different calculator programs.)

Several years ago, I phoned 411 to search for a business number in Huntington Beach.

“Say a city and state, please.”

“Huntington Beach, California.”

But 411 didn’t recognize Huntington Beach—a computer program didn’t know the city existed and kept asking me to say a city and state. No matter how loudly I yelled, 411 didn’t have the information. Good thing I didn’t need the Huntington Beach Police, huh?

One day, weapons of war may well be controlled by AI. This worries me. You can court martial a soldier for killing civilians. You can court martial a sniper for killing the wrong person. You can’t court martial AI—not even if it launches a missile (with or without a nuclear warhead).

On balance, I think I’d trust a drunk with a gun more than I would AI with deadly force—and I wouldn’t trust a drunk with a gun at all.

Yet governments and companies are embracing, and deploying, this technology as if it were fully developed instead of something with potential. I don’t like artificial intelligence, but we can’t escape it. Still, I rather wish someone had an escape plan—just in case.

Charles M. Kelly is the associate editor of the Sun. His opinions are his own.