Apple's Hesitation on AI Development: What It Means for the Future

It was reasonable to anticipate that Apple would approach AI the way it has approached many features and apps in the past: by observing, taking notes, and ultimately redefining the landscape. Although the company has softened some of the more controversial aspects of AI technology, it appears to have hit the same roadblock as its competitors: Apple Intelligence, like other AI systems, simply isn't all that capable.

Sure, it does perform certain tasks—just not at the level of innovation many hoped for. While Apple Intelligence amounts to a computationally demanding shortcut for everyday activities, that isn't inherently negative. In fact, as inference (running the model to analyze and generate text) becomes more efficient, these capabilities could be executed entirely on the device.

However, the initial promises suggested something much more revolutionary. During Monday's “Glowtime” event, Tim Cook touted Apple Intelligence's “breakthrough capabilities” and their “incredible impact,” while Craig Federighi claimed it would “transform how you use your iPhone.”

Here’s what Apple Intelligence can do:

- Rephrase snippets of text

- Summarize emails and messages

- Generate custom emoji and clip art

- Find photos of people, locations, and events

- Perform web searches

Do these features truly feel groundbreaking? Countless writing assistants already exist, and summarization is a basic function of nearly every large language model (LLM). Generating art with AI has become commonplace, and searching through photo libraries is easily achievable across various platforms. Even our “dumb” voice assistants have been fetching Wikipedia articles for a decade.

Admittedly, there are improvements: performing these tasks locally prioritizes privacy, allows for new user experiences (especially for those who struggle with traditional touch interfaces), and generally enhances convenience. However, none of these advancements are particularly novel or exciting. Aside from expected bug fixes, there don’t appear to be significant updates to features since their beta release post-WWDC. We'll have to conduct further testing to learn more.

Anticipating “Apple’s first phone built from the ground up for Apple Intelligence,” one might expect an expansive suite of features. Disappointingly, the iPhone 16 will not include several announced features at launch; they will be delivered in a subsequent update.

Is this a failure of vision or a limitation of technology? AI companies are already pivoting, repositioning their products as yet another enterprise SaaS tool instead of the transformative solutions that were widely discussed (many of which were merely regurgitated web content). AI has the potential to be invaluable in specific areas like scientific research, coding tasks, or materials design, but it seems it’s not quite ready for our daily lives.

There’s a perplexing disconnect between the increasing prevalence of AI capabilities and the lavish descriptions surrounding them. Apple appears to have adopted a more exaggerated promotional style, abandoning the restraint and innovation that once defined its brand. Monday’s event was among the least captivating in recent memory, yet the language used was more extravagant than ever before.

Thus, like other AI providers, Apple is playing along in the multi-billion-dollar charade of marketing these models as revolutionary—despite widespread skepticism about their true value. After all, how could anyone justify the enormous investments if the end result merely performs tasks that were possible five years ago?

While AI models may indeed usher in substantial change in specific niches, it's clear that the features Apple promotes don’t deliver anything notably new or groundbreaking. Ironically, Apple’s announcement has failed to provide AI with its long-hoped-for “iPhone moment.”
