Few would dispute that 2023 was a pivotal year for technology, with artificial intelligence at the forefront of innovation. That impact didn't escape the notice of dictionary publishers, whose "word of the year" lists reflect AI's growing prominence. Interestingly, the highlighted AI-related terms are often familiar words that have taken on new meanings. An intriguing twist, don't you think?
Cambridge Dictionary's chosen word is "hallucinate," aptly capturing the tendency of generative AI models like ChatGPT to fabricate anything from specific dates to entirely fictional people rather than admit a lack of knowledge. However, these systems are unaware of their limitations, as they don't possess true understanding.
As predictive text models, they focus solely on generating sentences that resemble their training data. Ask one about well-known 18th-century German surgeons and, finding no exact match, it might invent someone like Arman Verdigger of the fictional Einschloss Research Hospital in Tulingen. See? It sounds realistic! Unfortunately, these confidently presented fabrications are often accepted as fact without scrutiny.
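To make that failure mode concrete, here is a deliberately toy sketch of next-token sampling. The phrases, probabilities, and names are invented purely for illustration and bear no relation to any real model:

```python
import random

# A toy "language model": for a given context, the likelihood of each
# continuation, as if estimated from training data. Nothing in this table
# encodes whether a statement is true; only whether it reads plausibly.
NEXT_TOKEN_PROBS = {
    "A renowned 18th-century German surgeon was": [("Dr.", 0.7), ("Professor", 0.3)],
    "Dr.": [("Arman Verdigger", 0.6), ("Johann Mueller", 0.4)],
    "Professor": [("Arman Verdigger", 0.5), ("Johann Mueller", 0.5)],
}

def sample_next(context: str) -> str:
    """Pick a continuation weighted by how plausibly it follows the context."""
    options = NEXT_TOKEN_PROBS.get(context, [("[unknown]", 1.0)])
    tokens, weights = zip(*options)
    return random.choices(tokens, weights=weights)[0]

context = "A renowned 18th-century German surgeon was"
title = sample_next(context)
name = sample_next(title)
# The output reads fluently whether or not the person ever existed.
print(context, title, name)
```

Fluency and factuality come from entirely different places; the sampler above has plenty of the former and no concept of the latter.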
Nonetheless, "hallucination" can be put to good use: generative imagery and audio are inherently "hallucinated," producing novel combinations drawn from a model's training data without copying it outright. That usefulness comes with its own concerns, of course, as AI-generated art and imagery varies widely in both quality and how it is applied.
The adoption of "hallucinate," despite its original association with human perception, highlights our inclination to attribute human-like qualities to AI, according to Cambridge AI ethicist Henry Shevlin. "As we move further into this decade, I expect our psychological vocabulary will expand to accommodate the unusual abilities of the new intelligences we’re developing."
Meanwhile, Merriam-Webster took the opposite approach, selecting "authentic" as its word of the year. As the publisher explained: "With the rise of AI—and its influence on deepfake technology, actors’ contracts, academic integrity, and many other areas—the boundary between 'real' and 'fake' has become increasingly blurred."
Although "authentic" did not receive a new definition, its implications evolved significantly. For years, we’ve debated the authenticity of our actions and choices. Authenticity has become a paradox within modern consumerism; it cannot be bought or sold but remains perhaps the most valuable and marketable quality.
Previously, we worried about whether a trend genuinely reflected our own interests or someone else's. Now we find ourselves questioning whether the things we see exist at all, like the Pope's stunning Balenciaga puffer jacket, which turned out to be an AI-generated image.
"Deepfake" also appeared on Merriam-Webster’s longlist, transitioning from a niche term associated with revenge pornography to a broader term referring to generative AI technology. While its origins may be questionable, we have little control over what becomes part of our cultural lexicon.
In a similar vein, Oxford's notable AI term, though it was only a runner-up for word of the year, is "prompt." This adaptable word has gained a new definition tied to how humans interact with generative AI: the instructions that steer the model.
For example, when you instruct an AI system to generate article ideas based on the weather, that instruction is the "prompt." The verb has picked up a matching sense too, describing the act of directing an AI system.
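As a concrete illustration, here is a minimal sketch of what supplying that prompt might look like in code, assuming the OpenAI Python SDK; the model name and the weather description are placeholders, and any comparable chat API would do:

```python
from openai import OpenAI

# Assumes an API key is available via the OPENAI_API_KEY environment variable.
client = OpenAI()

weather = "cold, gray, and drizzling all week"  # placeholder input

# The "prompt" is simply the text we hand the model to steer what it generates.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": f"The weather is {weather}. Suggest three article ideas suited to it.",
    }],
)

print(response.choices[0].message.content)
```

Prompting, in the verb sense, is just this: phrasing and rephrasing that message until the model produces what you were after.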
These extensions sit comfortably alongside the word's existing meanings. For centuries we have prompted responses from one another, and in computing, "prompt" as a noun has long referred to the marker at a command line where the system waits for the user's input. This creates an intriguing role reversal: who is prompting whom? Whether the evolution enriches or dilutes the term is a matter of taste.
If you're curious about Oxford's official word of the year, it's "rizz," a trendy abbreviation of "charisma," a quality AI notably lacks, much like Tom Holland.
It's no surprise that AI terminology is seeping into everyday language, though it's a little disappointing that cooler terms like "latent space" haven't caught on yet. Given how fast the technology is moving, the expert lexicographers' decision to stick with established words seems wise. We eagerly await the next words of the year, especially as dictionary teams weigh whether terms like "vectors" and "embeddings" deserve their moment in the spotlight.
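For the curious, those last two terms are less exotic than they sound: an embedding maps a piece of text to a vector (a list of numbers), and texts with related meanings land near one another. The sketch below uses tiny hand-made vectors purely for illustration; real models learn hundreds or thousands of dimensions from data.

```python
import math

# Hand-made three-dimensional "embeddings", invented for illustration only.
EMBEDDINGS = {
    "hallucinate": [0.9, 0.1, 0.3],
    "fabricate":   [0.8, 0.2, 0.4],
    "authentic":   [0.1, 0.9, 0.2],
}

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Higher values mean the two vectors point in more similar directions."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(EMBEDDINGS["hallucinate"], EMBEDDINGS["fabricate"]))  # ~0.98, close in meaning
print(cosine_similarity(EMBEDDINGS["hallucinate"], EMBEDDINGS["authentic"]))  # ~0.27, further apart
```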