Apple Rethinks AI Strategy, Opening Siri to Third-Party Chatbot Integrations
Apple's $1 billion Siri overhaul flopped. Now it's turning defeat into a distribution empire, opening iOS 27 to rival chatbots it can tax at 30%.

After burning through roughly $1 billion and two years trying to build a competitive AI assistant, Apple stopped chasing the chatbot race and decided to own the track instead.
Bloomberg reporter Mark Gurman revealed on March 26 that Apple plans to open Siri to outside AI assistants in iOS 27, introducing a new framework called Extensions that would let third-party chatbots, including Google Gemini, Anthropic's Claude, and OpenAI's ChatGPT, plug directly into the Siri experience on iPhone, iPad, and Mac. The formal announcement is expected at the WWDC 2026 keynote on June 8.
The mechanics are straightforward, but the strategic logic runs deeper than feature parity. Users would install a preferred AI chatbot from the App Store, then enable it as an Extension inside the Apple Intelligence and Siri section of Settings. From that point, Siri routes queries to whichever assistant the user has configured. In effect, Apple is preparing to transform Siri from a single-provider assistant into a multi-provider platform, and every one of those providers must pass through a storefront Apple controls and monetizes: the App Store's standard 30% commission applies to AI subscriptions purchased through that channel.
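The flow Bloomberg describes, install a chatbot, enable it in Settings, and let Siri hand queries to it, resembles a classic provider-dispatch pattern. The sketch below illustrates that pattern only; every class and method name is invented, since Apple has published no Extensions API.

```python
# Illustrative sketch of a multi-provider assistant router.
# All names here are hypothetical; Apple has not published
# the iOS 27 Extensions framework.

class AssistantExtension:
    """A registered third-party assistant (e.g. a chatbot backend)."""
    def __init__(self, name):
        self.name = name

    def handle(self, query):
        # A real extension would call the provider's model here.
        return f"[{self.name}] response to: {query}"


class SiriRouter:
    """Routes every query to whichever extension the user enabled."""
    def __init__(self, fallback):
        self.fallback = fallback   # built-in Siri behavior
        self.enabled = None        # the user's chosen extension, if any

    def enable(self, extension):
        self.enabled = extension   # the step done in Settings, per the report

    def ask(self, query):
        target = self.enabled or self.fallback
        return target.handle(query)


router = SiriRouter(fallback=AssistantExtension("Siri"))
print(router.ask("weather?"))      # handled by the built-in fallback

router.enable(AssistantExtension("Gemini"))
print(router.ask("weather?"))      # now routed to the enabled extension
```

The key design point the reporting implies is that the router, not the extension, owns the user relationship: whichever provider is enabled, every query still enters through the Siri layer Apple controls.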
Bloomberg's reporting makes clear the Extensions model is expected to generate new App Store revenue. The parallel to Apple's existing search arrangement is hard to ignore. Bloomberg Law's analysis draws a direct comparison between the Extensions structure and the Google-Apple search deal, under which Google paid up to $20 billion annually for default placement on Apple devices. An AI extensions marketplace could replicate that leverage, with multiple companies competing for visibility inside Siri rather than a single payer.
The pivot is also a confession. Apple shuffled its AI leadership twice, promised on-device intelligence features at WWDC 2024 that still have not shipped, and watched rivals absorb the headlines. Amazon, Microsoft, Alphabet, and Meta collectively guided toward roughly $700 billion in AI-related capital expenditures for 2026, while Apple's fiscal 2025 spending came in at $12.7 billion. Rather than try to close that gap on raw model capability, Apple is repositioning Siri as the discovery layer, the interface through which hundreds of millions of iPhone users first encounter any AI at all.
The company's deal with Google goes further than previously understood. Apple reportedly received full access to the Gemini model, running in Apple's own data centers. That access lets Apple distill smaller models from Gemini to power specific on-device tasks and features. The distilled models would run directly on iPhones and iPads without sending data to Google's servers, a distinction Apple is expected to lean on heavily in its privacy pitch to consumers and regulators alike.
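Distillation itself is a standard technique, independent of any Apple or Google code: a small "student" model is trained to match the temperature-softened output distribution of a large "teacher." A minimal numerical sketch of the core step, soft targets and the cross-entropy loss the student trains on:

```python
import math

def softmax(logits, temperature=1.0):
    """Softened probability distribution; higher temperature -> flatter."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student's softened outputs against the
    teacher's softened targets -- the signal the student trains on."""
    p = softmax(teacher_logits, temperature)   # teacher's soft targets
    q = softmax(student_logits, temperature)   # student's predictions
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher = [4.0, 1.0, 0.5]        # large model's raw scores for 3 classes
good_student = [3.8, 1.1, 0.4]   # closely mimics the teacher
bad_student = [0.5, 4.0, 1.0]    # disagrees with the teacher

# The loss is lower when the student matches the teacher's distribution.
print(distillation_loss(teacher, good_student) <
      distillation_loss(teacher, bad_student))   # True
```

In practice the student is a full neural network trained over many examples, but the mechanism is the same: the small model learns to reproduce the big model's behavior, which is what would let distilled Gemini-derived models run entirely on-device.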
That privacy framing is also where Apple believes it holds a genuine competitive edge over Google and OpenAI, both of which distribute their AI through browser defaults and cloud-first infrastructure that routes user data through their own systems. Apple's argument to developers and to antitrust watchdogs is that Extensions creates an open marketplace rather than a walled preference. Whether regulators in Brussels and Washington accept that framing is a live question, given that Apple sets the commission rates, controls default placement, and determines which apps meet App Store eligibility.
The concept is analogous to how iOS handles default browsers or email clients, but applied to AI assistants. The browser wars were litigated in courtrooms for years. The AI version of that fight is likely just beginning.

