Apple is reportedly working on a major expansion of its artificial intelligence strategy with iOS 27, which would allow users to choose among multiple third-party AI models for system-level functions, marking a shift away from a single, fixed, Apple-controlled AI engine.
According to reporting originating from Bloomberg and echoed by outlets such as MacRumors and TechCrunch, the planned system would introduce a new framework internally described as “Extensions.” This framework would enable Apple Intelligence to connect with external AI providers and route user requests to different models depending on user selection and system configuration.
Under the reported design, core Apple Intelligence features, including Siri interactions, writing assistance, image generation, and other system-wide AI tools, would no longer rely exclusively on Apple’s own models. Instead, users could select from multiple supported AI engines, effectively customising the underlying intelligence powering their device.
The Extensions framework is described as an integration layer that allows AI capabilities from installed applications or external services to be accessed directly by the operating system. Rather than being locked into a single assistant model, Apple devices would dynamically call on different AI systems for tasks based on the user’s preferences.
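The integration layer described above can be illustrated with a minimal sketch. Nothing about Apple's actual API has been published, so every name below (`Provider`, `ModelRouter`, the example model names) is a hypothetical stand-in; the sketch only shows the general pattern of registering several AI backends and routing requests to whichever one the user has selected.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# Hypothetical sketch of a multi-provider routing layer. None of these
# names correspond to real Apple APIs, which have not been published.

@dataclass
class Provider:
    name: str
    handle: Callable[[str], str]  # takes a request string, returns a response

class ModelRouter:
    """Routes system-level AI requests to the provider the user selected."""

    def __init__(self) -> None:
        self.providers: Dict[str, Provider] = {}
        self.selected: Optional[str] = None

    def register(self, provider: Provider) -> None:
        self.providers[provider.name] = provider

    def select(self, name: str) -> None:
        if name not in self.providers:
            raise KeyError(f"unknown provider: {name}")
        self.selected = name

    def route(self, request: str) -> str:
        if self.selected is None:
            raise RuntimeError("no provider selected")
        return self.providers[self.selected].handle(request)

# Example: two placeholder backends standing in for external AI services.
router = ModelRouter()
router.register(Provider("model-a", lambda req: f"[model-a] {req}"))
router.register(Provider("model-b", lambda req: f"[model-b] {req}"))
router.select("model-b")
print(router.route("summarise this note"))  # → [model-b] summarise this note
```

The key design point the reports describe is exactly this separation: the operating system owns the router and the user-facing interface, while the registered providers are interchangeable.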
Reports indicate that Apple is in discussions with several major AI providers about integrating them into this system, including OpenAI, Anthropic, and Google, whose Gemini models are mentioned as a potential option for Siri or other system-level tasks depending on configuration.
Coverage from TechCrunch describes the initiative as a move toward a more flexible, customisable AI environment within Apple’s ecosystem, where users can effectively choose which model powers their device experience. This represents a notable departure from Apple’s traditional approach of tightly controlling core system behaviour through in-house technology.
Analysts suggest the shift reflects both competitive pressure in the rapidly evolving AI space and Apple’s effort to offer greater user choice while maintaining its privacy-focused platform identity. By acting as an intermediary layer rather than a single-model provider, Apple could support multiple AI systems without forcing users into one ecosystem.
If implemented as reported, iOS 27 would transform Apple Intelligence from a unified assistant into a multi-model orchestration system, giving users control over the AI backend used across Apple devices while keeping the interface experience consistent within the operating system.
Senior Reporter/Editor
Bio: Ugochukwu is a freelance journalist and Editor at AIbase.ng, with a strong professional focus on investigative reporting. He holds a degree in Mass Communication and brings extensive experience in news gathering, reporting, and editorial writing. With over a decade of active engagement across diverse news outlets, he contributes in-depth analytical, practical, and expository articles exploring artificial intelligence and its real-world impact. His seasoned newsroom experience and well-established information networks provide AIbase.ng with credible, timely, and high-quality coverage of emerging AI developments.