Apple just announced a major shift in its AI strategy: iOS 27 will let third-party AI chatbots integrate directly with Siri through a new “Extensions” system. Rather than relying solely on Apple’s ChatGPT partnership, users will be able to route queries to Claude, Gemini, Grok, and potentially others.
The Bloomberg report positions this as Apple ending OpenAI’s exclusive arrangement. But the details reveal something more calculated: Apple is opening a door while reinforcing every wall around it.
How Extensions Will Work
According to test versions of the upcoming software, the Extensions system will let “agents from installed apps work with Siri, the Siri app and other features on your devices.”
The practical workflow:
- Download an AI chatbot app from the App Store (Claude, Gemini, etc.)
- Enable it in Settings under Apple Intelligence and Siri
- When Siri can’t handle a question, it offers to send it to your chosen service
- Alternatively, explicitly ask Siri to use a specific chatbot
Apple will provide direct links to chatbot apps within Settings, steering users toward officially supported integrations. The company plans to announce this at WWDC on June 8, with iOS 27 rolling out in the fall.
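The workflow above amounts to a routing policy: Siri stays the gatekeeper, and enabled chatbots only see queries Siri either can’t answer or is explicitly told to forward. Here is a minimal sketch of that logic in Swift. Apple hasn’t published the Extensions API, so every type and function name here (`ExtensionRegistry`, `route`, `RoutingDecision`) is purely illustrative, not Apple API.

```swift
// Hypothetical model of the Siri Extensions routing flow described above.
// Nothing here is Apple API; the names are invented for illustration.

enum RoutingDecision: Equatable {
    case handledBySiri(String)                  // Siri answered it itself
    case forwarded(to: String, prompt: String)  // overflow sent to a chatbot
    case refused(String)                        // no eligible service
}

struct ExtensionRegistry {
    // Chatbots the user has enabled in Settings (per the workflow above).
    private(set) var enabled: Set<String> = []
    mutating func enable(_ name: String) { enabled.insert(name) }
    mutating func disable(_ name: String) { enabled.remove(name) }
}

func route(query: String,
           preferred: String?,
           registry: ExtensionRegistry,
           siriCanAnswer: (String) -> Bool) -> RoutingDecision {
    // 1. Explicit request ("Ask Claude …") works only if that app is enabled.
    if let name = preferred {
        return registry.enabled.contains(name)
            ? .forwarded(to: name, prompt: query)
            : .refused("\(name) is not enabled in Settings")
    }
    // 2. Siri answers what it can; only overflow reaches a third party.
    if siriCanAnswer(query) {
        return .handledBySiri("Siri answer for: \(query)")
    }
    if let fallback = registry.enabled.first {
        return .forwarded(to: fallback, prompt: query)
    }
    return .refused("No extension enabled")
}
```

The design choice worth noticing is step 2: the third party never sees the query unless Siri declines it first, which is exactly the “overflow queries” arrangement the reporting describes.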
What’s Actually Opening Up
The integration extends beyond the phone. For the first time, third-party chatbots will work in CarPlay. Developers can build voice chat experiences that auto-launch when apps open, reducing friction while driving.
This is genuinely new territory. Companies like Anthropic and OpenAI have been locked out of in-vehicle AI integration until now. Apple’s opening that access.
The Walls That Remain
But here’s what Apple isn’t changing:
No wake-word access for competitors. “Hey Siri” still only activates Apple’s assistant. Third-party chatbots in CarPlay won’t respond to voice triggers — users must manually open the app first.
No system-level privileges. Claude and Gemini won’t control your iPhone’s settings, access calendar data contextually, or integrate with Dynamic Island. They remain sandboxed apps that receive queries when Siri passes them along.
No replacement of Siri as default. You can’t set Claude as your primary voice assistant. Siri remains the privileged interface for the side button, Spotlight, and system shortcuts.
As Gadget Hacks notes: “Siri is wired into the things that matter most on an Apple device: system controls, on-device user data, default interface real estate.” Apple is letting competitors handle overflow queries while keeping the infrastructure.
The Privacy Question
Apple’s selling point is privacy. But recent research complicates that claim.
Israeli cybersecurity firm Lumia Security found that Apple’s AI ecosystem routinely transmits sensitive user data to Apple servers beyond what privacy policies indicate. Siri sends the content of dictated messages — including WhatsApp communications — to Apple servers. Location data accompanies every Siri request regardless of relevance.
Now Apple is creating pathways for that same data to reach Google, Anthropic, or xAI servers. The company says users must approve each chatbot integration and can disable services individually. But the fundamental tension remains: Apple Intelligence collects data locally and processes it remotely, even as Apple brands itself the privacy-first alternative.
When you route a query to Claude through Siri, what exactly travels to Anthropic’s servers? Apple hasn’t clarified the data minimization approach for third-party Extensions.
Why Apple Is Doing This
The strategic logic is straightforward:
Revenue sharing. When users subscribe to Claude Pro or Gemini Advanced through the App Store, Apple takes its cut. Opening Siri to competitors creates new App Store subscription revenue.
Reduced dependency. The Google partnership for Gemini-powered Apple Intelligence already raised questions about Apple’s AI self-sufficiency. Letting multiple chatbots compete reduces any single provider’s leverage.
Competitive positioning. If users want Claude for coding questions and Gemini for web searches, they can have both — but only on Apple devices. The Extensions system becomes a differentiator rather than a limitation.
What This Means
Apple isn’t abandoning control. It’s creating a managed marketplace where Siri remains the gatekeeper and competitors pay rent.
For users, this means more choice in AI chatbots without truly escaping Apple’s ecosystem. You can use Claude, but you’ll access it through Apple’s interface, under Apple’s rules, with Apple taking its percentage.
For developers, the opportunity is real but constrained. CarPlay integration opens new contexts, but system-level access remains off-limits.
The announcement lands at an interesting moment. Apple’s much-hyped Gemini-powered Siri missed its March deadline and remains in development. Meanwhile, OpenAI, Anthropic, and Google race to be the AI interface people actually use.
Apple’s answer isn’t to win that race. It’s to own the track.
What You Can Do
If you’re planning to use third-party AI chatbots through iOS 27:
Review privacy policies for each service. Apple’s Extensions framework will require disclosure, but the actual data handling varies by provider. Claude, Gemini, and ChatGPT have different retention and training policies.
Understand the limitations. These aren’t Siri replacements. They’re additional services you can invoke through Siri. System integration, background access, and voice activation remain Apple-exclusive.
Consider whether you need this. The chatbot apps already exist. iOS 27 makes invoking them slightly more convenient. If you’re concerned about data sharing, sticking with one trusted service might be simpler than juggling multiple Extensions.
Apple frames this as opening up. The execution suggests something more familiar: controlled access on Apple’s terms.