
The Illusion of Thought: How Everyday Language Misrepresents Artificial Intelligence

Artificial intelligence frequently attracts descriptions borrowed from human cognition. A recent academic study indicates that this everyday language quietly shapes public understanding of what machines can actually do.

Researchers from Iowa State University examined how news media apply mental verbs to artificial intelligence. The study, published in the peer-reviewed journal Technical Communication Quarterly, analyzed anthropomorphism: the attribution of human characteristics to non-human entities.

The research team investigated how writers apply words associated with human cognition to tools like ChatGPT. Jo Mackiewicz, a professor of English and co-author of the study, notes that people routinely use mental verbs to relate to the world. She explains that applying words like “think” or “know” to machines implies levels of agency and comprehension the systems do not possess.

The research shows that journalists take a cautious approach to artificial intelligence coverage and rarely portray these systems as fully human. Writers often use anthropomorphic terms to describe practical constraints, such as a system’s need for data, or as shorthand for complex technical processes.

Even restrained language influences public perception as artificial intelligence tools integrate into daily workflows. Words that suggest a machine decides or wants something inflate expectations about reliability and autonomy. This language obscures the fundamental role of human designers and operators.

Artificial intelligence systems do not form beliefs or intentions. They generate outputs by recognizing statistical patterns in large datasets. The researchers frame this issue as an ethical concern that goes beyond technical accuracy: portraying artificial intelligence as an independent agent shifts responsibility away from the organizations that build and operate it.

Clarity in language translates directly into accountability as these tools shape critical sectors like healthcare and finance. Precision in vocabulary serves as a crucial safeguard in this era of rapid technological adoption.

