Unleashing the True Potential of Apple Intelligence in Third-Party Applications

Apple Intelligence, the new suite of AI capabilities debuting in iOS 18, is set to revolutionize how we engage with apps.

The traditional App Store model is facing increasing scrutiny from regulators, and at the same time users are growing accustomed to getting information by simply asking AI assistants like ChatGPT. Advocates argue that AI may soon become the go-to method for finding answers, boosting workplace productivity, and encouraging creative exploration.

So, what does this mean for the app ecosystem, and for Apple's burgeoning services revenue, which surpassed $6 billion last quarter?

The answer lies at the heart of Apple’s AI strategy.

Apple Intelligence introduces a streamlined set of features, including writing assistance, summarization, generative art, and essential tools to enhance user experience. At this year's Worldwide Developers Conference (WWDC) in June, Apple unveiled new functionalities that will deepen the integration of developers’ apps with both Siri and Apple Intelligence.

These smart enhancements empower Siri to access any item within an app menu without further action required from developers. For example, users can prompt Siri with commands like “show me my presenter notes,” and Siri will know precisely how to respond. Additionally, Siri will be able to read and act upon text displayed on the screen. So, if you see a reminder to wish a family member a “happy birthday,” you could simply say “FaceTime him,” and Siri would execute that action seamlessly.

This functionality is already a marked improvement over the basic capabilities of today’s Siri, but it doesn’t stop there. Apple is equipping developers with tools that will enable them to incorporate Apple Intelligence directly into their applications. During WWDC, Apple indicated that the initial rollout of these capabilities would target specific app categories including Books, Browsers, Cameras, Document Readers, File Management, Journals, Mail, Photos, Presentations, Spreadsheets, Whiteboards, and Word Processors. In the future, Apple is expected to extend these functionalities to all App Store developers.

The AI features will be built upon the App Intents framework, enhanced with new intents designed for developers. This initiative aims to allow users to interact with Siri not only to open apps but also to utilize them efficiently.

This means users will no longer need to navigate through complex app menus to find the functionalities they require; they can simply ask Siri. Conversations can flow naturally, allowing users to reference their context. For instance, a user may request a photo-editing app like Darkroom to “apply a cinematic preset to the photo I took of Ian yesterday.” While today’s Siri may struggle with such requests, the AI-powered version will understand to invoke the app’s Apply Filter intent and identify the correct photo.
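Apple has not published Darkroom's internals, but the general shape of such an action is defined by the App Intents framework. The sketch below is hypothetical: the intent name, the `presetName` parameter, and the editing step are illustrative stand-ins, not any real app's API.

```swift
import AppIntents

// Hypothetical sketch of an "Apply Filter" action exposed to Siri
// through the App Intents framework. The preset and photo parameters
// are illustrative; a real app would wire perform() into its own
// editing pipeline.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"
    static var description = IntentDescription("Applies a named preset to a photo.")

    @Parameter(title: "Preset")
    var presetName: String

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult {
        // The app's editing logic would run here, e.g. loading the
        // photo data and applying the named preset before saving.
        return .result()
    }
}
```

Because the intent declares typed parameters, Siri can fill them from conversation context ("the photo I took of Ian yesterday") rather than requiring the user to open the app and tap through menus.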

Siri will also interpret user commands effectively, even if they include minor errors or refer back to previous parts of the conversation. Moreover, Siri will facilitate actions across different apps. For example, after editing a photo, you could instruct Siri to transfer it to another app, like Notes, without any manual input.

Additionally, Spotlight, the iPhone's search feature, will be enhanced to index and search data from apps, integrating with entities such as photos, messages, files, calendar events, and more.
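On the developer side, this indexing builds on the same entity model as App Intents. The following is a minimal, hedged sketch using the `IndexedEntity` protocol Apple introduced for iOS 18; `PhotoEntity`, its fields, and the query are hypothetical examples, not a real app's types.

```swift
import AppIntents
import CoreSpotlight

// Hypothetical sketch: making an app's photos discoverable by
// Spotlight and Siri by modeling them as App Intents entities
// that conform to iOS 18's IndexedEntity protocol.
struct PhotoEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"
    static var defaultQuery = PhotoQuery()

    var id: UUID
    var caption: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }
}

struct PhotoQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [PhotoEntity] {
        // Look up matching photos in the app's own store (omitted here).
        []
    }
}

// Donating the entities so Spotlight can index them, e.g.:
// try await CSSearchableIndex.default().indexAppEntities(photos)
```

Once donated, the system, rather than the app, handles surfacing those entities in search results alongside photos, messages, files, and calendar events.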

This more nuanced application of AI hinges on developer engagement. Over the years, Apple's revenue-sharing policies, which typically let the company retain 30% of revenue from products and services sold through an app, have strained relationships with large and indie developers alike. Nevertheless, as Siri transforms previously obscured apps within the App Library into easily accessible tools via voice commands, developers could find renewed incentive to collaborate.

Instead of leading users through monotonous onboarding processes, developers can concentrate on ensuring Siri comprehends their app's functions and anticipates user requests. This paves the way for users to interact with apps using Siri in a manner similar to how they currently engage with AI chatbots like ChatGPT.

Third-party developers will also benefit from Apple’s innovative AI framework.

Through a partnership with OpenAI, Siri will be able to hand queries off to ChatGPT when it lacks an answer. Moreover, with the visual search capability available on the iPhone 16 lineup, Apple is enabling users to tap a new Camera Control button, turning what the camera sees into actionable queries via OpenAI's chatbot or Google Search.

While these advancements may not immediately feel as groundbreaking as the launch of ChatGPT, the pace of developer adoption will likely determine their impact. The features may also be further off than they appear: in the latest iOS 18 betas, they feel underdeveloped. Although there were moments of surprise at the new Siri's abilities, there were equally instances of confusion about its limitations, even within Apple's own apps. For example, while you can instruct Siri in the Photos app to send a photo, it won't accommodate more complex tasks like converting the photo into a sticker. Until Siri overcomes these user experience hurdles, the new functionalities may come off as frustrating rather than fluid.
