Apple iOS 27 to Introduce 'Extensions' for Siri, Allowing Users to Link to ChatGPT, Gemini, or Claude

Apple's iOS 27 will reportedly let users choose third-party AI chatbots like Google Gemini or Anthropic Claude to power Siri responses via a new 'Extensions' feature. This follows Apple's confirmed deal with Google to power its overhauled Siri, signaling a major shift from a closed to an open AI assistant ecosystem.

Gala Smith & AI Research Desk · 1d ago · 6 min read · AI-Generated
Source: theverge.com via the_verge_tech, @kimmonismus, @mweinbach · Widely Reported

According to a report from Bloomberg's Mark Gurman, Apple is preparing a significant architectural shift for Siri in its upcoming iOS 27, iPadOS 27, and macOS updates. The company will reportedly introduce a new system called "Extensions" that will allow users to select and enable third-party AI chatbots downloaded from the App Store to fetch replies for Siri.

This move would transform Siri from a monolithic, Apple-controlled assistant into a platform capable of routing user queries to the AI model of the user's choice, including competitors like Google's Gemini, Anthropic's Claude, and OpenAI's ChatGPT.

What's New: Siri as a Routing Layer

The core change is conceptual: Siri would act less as the sole intelligence and more as an intelligent router and interface layer. According to the report, the "Extensions" feature will present users with a menu to enable or disable specific chatbot integrations on their iPhone, iPad, and Mac.

  • User Choice: Users could theoretically set Claude as their default for creative writing, Gemini for web search integration, or ChatGPT for general tasks, depending on their preference and subscription status.
  • System-Wide Integration: The upcoming integrations are expected to work not only with the traditional Siri voice interface but also with a rumored standalone app for Apple's "AI-upgraded" version of Siri, which is reportedly designed to let Siri take actions across other applications on a user's behalf.
  • App Store Distribution: The model suggests chatbots would need to be installed as standalone apps from the App Store, with their Siri extension enabled within iOS settings, similar to how other app extensions work today.

This development follows Apple's confirmed strategic partnership with Google, announced in January, where Google will provide the underlying AI infrastructure to power Apple's own overhauled Siri. A separate report from The Information this week added that the deal also includes provisions for Apple to use Gemini to train its own, smaller on-device AI models.

Technical & Strategic Context

This reported shift is a direct response to the rapid evolution of the AI assistant landscape since late 2022. While Apple has been developing its own large language models (like the rumored "Ajax"), the performance gap between proprietary models and frontier models from OpenAI, Anthropic, and Google has pressured the company to adopt a hybrid strategy.

  1. The Google Deal (Confirmed): Apple is leveraging Google's Gemini infrastructure for its core Siri overhaul. This provides a state-of-the-art backbone while Apple continues its internal development.
  2. The Extensions Plan (Reported): By opening Siri to other models, Apple mitigates the risk of its assistant falling behind. It turns a competitive threat into a platform feature, allowing users to access the best available models without Apple needing to build or maintain them all. It also potentially generates App Store revenue from AI chatbot subscriptions.

Apple is expected to detail its full AI strategy and the latest versions of its operating systems at its Worldwide Developers Conference (WWDC), which begins on June 8th.

gentic.news Analysis

This report, if accurate, represents a profound strategic pivot for Apple, moving from a walled-garden AI approach to becoming a gatekeeper for an ecosystem of AI models. For our technical audience, the key implication is the potential standardization of an "AI extension" protocol for iOS. Developers at Anthropic and OpenAI would need to build to Apple's spec, similar to how they currently integrate with the Model Context Protocol (MCP) for tools like Claude Code. The question is whether Apple will create a rich, agentic API that allows these external models to truly "take action across apps" on a user's behalf, or if it will be a more limited query-and-response channel.

This aligns with the broader industry trend of AI models becoming interoperable platforms, a trend we've covered extensively with the rise of Claude Code and its MCP ecosystem. The intense competition noted in our Knowledge Graph between Anthropic, OpenAI, and Google now has a new battleground: default status on hundreds of millions of Apple devices. While Anthropic is projected to surpass OpenAI in revenue by mid-2026, as per our historical data, distribution through iOS could be a significant new variable in that race.

Furthermore, this move could accelerate the consumer adoption of specialized AI agents. A user might configure Siri to use Claude Code-like logic for coding tasks routed through a terminal, while using a different model for email drafting. It effectively brings the power of choosing the right tool for the job—a core principle for our engineering-focused readers—to the mainstream consumer assistant.

Frequently Asked Questions

When will Siri's AI Extensions be available?

According to the Bloomberg report, the feature is planned for iOS 27, iPadOS 27, and the corresponding macOS update. Apple is expected to announce these operating systems at its Worldwide Developers Conference (WWDC) starting June 8th, with a public release likely in the fall of 2026.

Will I have to pay for each AI chatbot I connect to Siri?

The report does not specify pricing details. It is likely that users would need an active subscription to services like ChatGPT Plus or Claude Pro to use their advanced models through Siri, similar to how you need a subscription to use those services in their standalone apps today. Apple may also take a percentage of in-app subscriptions sold through the App Store.

How is this different from Apple's deal with Google for Gemini?

These are two separate but related strategies. The Google Gemini deal is a backend partnership to power Apple's own rebuilt Siri intelligence. The "Extensions" feature is a front-end platform play that would allow users to override or supplement that built-in intelligence with other chatbots like Claude or ChatGPT for specific queries or as a default.

Does this mean Siri is getting worse or that Apple gave up on its own AI?

Not necessarily. It indicates Apple is prioritizing a good user experience and choice in the short term while it continues to develop its own models. The Google deal provides a high-quality baseline. The Extensions feature ensures users are not locked out of other advancements. Apple's long-term goal likely remains a superior, fully integrated on-device AI, but this approach buys them time and keeps users within the Apple ecosystem.

AI Analysis

This is a classic Apple strategy play: commoditize the complement. By opening Siri to third-party AI, Apple turns its potential weakness (lagging in frontier model development) into a platform strength. The real technical intrigue lies in the API design. Will Apple create a robust agent framework, allowing Claude or GPT to execute multi-step tasks with app control, or will it be a simple text-in, text-out pipe? Given Apple's focus on privacy and control, the initial implementation will likely be heavily sandboxed, perhaps starting with query routing before evolving into true agentic action.

The move also validates the multi-model, best-tool-for-the-job workflow that advanced users already employ, bringing that paradigm to the masses. For AI developers, it creates a massive new distribution channel but also a new set of constraints: they must now optimize for Siri's voice-first context and Apple's strict privacy and UI guidelines. This could dampen some of the raw capability we see in desktop tools like Claude Code, but it will force a focus on reliability and concise, actionable outputs.

Finally, this accelerates the platformization of AI. Models are no longer just products; they are potential platform services. Apple's move will pressure other OS and device makers (Google, Microsoft, Samsung) to offer similar openness, further entrenching the model-as-a-service business model and intensifying competition on inference cost, latency, and specialization.
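
To make the pipe-versus-agent distinction concrete, the two interface shapes below differ in what the external model is allowed to return. Both are entirely hypothetical (Apple has announced no such interfaces); the names `pipe_query`, `agent_query`, and `Action` are invented for this sketch.

```python
from dataclasses import dataclass

# Shape 1: a plain text-in, text-out pipe. The external model can only
# answer a query; Siri keeps full control of the device.
def pipe_query(model, text: str) -> str:
    return model(text)

# Shape 2: an agentic interface. The model proposes structured actions,
# and the host (Siri) filters them against user-granted permissions
# before anything executes -- the "heavily sandboxed" scenario.
@dataclass
class Action:
    app: str   # target app, e.g. "Mail"
    verb: str  # e.g. "compose"

def agent_query(model, text: str, allowed_apps: set[str]) -> list[Action]:
    proposed = model(text)
    # Sandboxing step: drop any action aimed at an app the user
    # has not explicitly granted to this extension.
    return [a for a in proposed if a.app in allowed_apps]

# Toy model that proposes one permitted and one forbidden action.
def toy_model(text: str) -> list[Action]:
    return [Action("Mail", "compose"), Action("Photos", "delete")]

approved = agent_query(toy_model, "email my notes", allowed_apps={"Mail"})
print(approved)  # only the Mail action survives the permission filter
```

The design question for Apple is exactly where on this spectrum to start: shape 1 is trivially safe but limited, while shape 2 requires a permission model like the `allowed_apps` filter above before external models can act across apps.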