Apple–Google Gemini Deal Explained: What the $1B Siri Upgrade Means for You
Apple is about to do something that would have sounded impossible a few years ago: pay Google roughly $1 billion a year so Siri can borrow Google’s Gemini AI brain.
TL;DR (too long; talked to Siri)
- The deal: Apple is reportedly close to a $1 billion-per-year agreement to use a custom version of Google’s Gemini AI model to power a revamped Siri. The Gemini model has about 1.2 trillion parameters, far larger than Apple’s current models.
- Why it matters: Gemini should help Siri handle context better, plan multi-step tasks, and deliver more natural answers — a major upgrade over today’s assistant.
- Timing: The overhauled Siri, internally linked to projects codenamed “Linwood” and “Glenwood”, is expected to arrive with iOS 26.4 around spring 2026, according to multiple reports.
- Privacy angle: The custom Gemini model is expected to run on Apple’s Private Cloud Compute infrastructure, meaning Apple — not Google — controls the servers that process Siri requests.
- Big picture: This is a stopgap while Apple races to finish its own giant AI model. It’s also a quiet admission that Siri fell behind and needs outside help to catch up.
1. What’s actually happening between Apple and Google?
Multiple reports from outlets including Bloomberg, Reuters and The Verge say Apple is finalizing a deal to use a custom version of Google’s Gemini AI model as the cloud “brain” behind a redesigned Siri. The model reportedly clocks in at around 1.2 trillion parameters, compared to Apple’s existing models at roughly 150 billion parameters.
In return, Apple is expected to pay Google around $1 billion per year for access to this custom Gemini model. The focus is on giving Siri much stronger abilities in planning, summarising, and handling more complex, multi-step user requests.
Crucially, this is not Google Search taking over your iPhone. The reported deal is about the assistant layer, not about plugging Google’s AI search directly into Safari or Spotlight. Siri will still look like Siri — it just gets a much smarter brain behind the scenes.
Inside Apple, the overhaul is associated with two codenames you’ll see in leaks and reports:
- Glenwood – the internal project name for using large language models and third-party AI to “fix” Siri.
- Linwood – the codename for the new Siri experience, expected to debut with iOS 26.4 around spring 2026.
2. What will Gemini actually do inside Siri?
Think of Gemini as a powerful co-pilot that lives behind Siri, rather than a separate chatbot you interact with directly.
- Better understanding of context – Gemini’s size and training allow it to keep track of longer conversations, understand what “that” or “it” refers to, and connect details across multiple queries and apps.
- Planning multi-step tasks – You could ask Siri to “plan a 3-day trip to Lisbon, find flights, suggest an itinerary, and add everything to my calendar,” and Gemini would handle the complex reasoning, with Siri executing actions in your apps.
- Summarising information – Gemini is expected to power features that summarise long emails, messages, documents or notification bundles into short, digestible answers Siri can read back or display.
- Mostly invisible branding – Apple reportedly sees Google’s role as that of a behind-the-scenes technology supplier, and may not loudly market “Gemini inside” to users.
Apple’s own models won’t vanish. Reporting suggests Apple will still rely on its on-device AI for highly personal, tightly integrated tasks, while Gemini in the cloud handles the heavy general-purpose reasoning and summarisation. That hybrid approach helps balance performance, privacy and cost.
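To make the hybrid idea concrete, here is a rough sketch of what a request router along those lines could look like. Everything in it is invented for illustration: the keyword list, the function name and the return labels are assumptions, not Apple's actual (and unpublished) routing logic.

```python
# Illustrative sketch of a hybrid on-device / cloud routing decision.
# All names and rules here are hypothetical, not Apple's real implementation.

PERSONAL_KEYWORDS = {"my calendar", "my messages", "my photos", "my contacts"}

def route_request(query: str) -> str:
    """Decide where an assistant-style request would be handled.

    Returns "on-device" for queries that touch personal data,
    and "private-cloud" for heavy general-purpose reasoning
    (e.g. a large cloud model running on Apple-managed servers).
    """
    lowered = query.lower()
    if any(kw in lowered for kw in PERSONAL_KEYWORDS):
        return "on-device"      # personal data stays local
    return "private-cloud"      # general reasoning goes to the cloud model

print(route_request("Summarise this 40-page PDF"))       # private-cloud
print(route_request("When is my calendar free today?"))  # on-device
```

A real system would be far more nuanced (intent classifiers rather than keyword matching, fallbacks, user consent checks), but the basic shape is the same: classify the request first, then send it to the cheapest, most private engine that can handle it.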
3. Why is Apple paying a rival $1B a year?
The short answer: time and scale. Building a trillion-parameter model, plus the infrastructure to serve it safely to hundreds of millions of users, is a multi-year, multi-billion-dollar project. Apple has strong on-device AI, but has lagged behind in massive, cloud-scale models that power tools like ChatGPT and full-strength Gemini.
- Gemini is production-ready now. By licensing Gemini, Apple can ship a competitive AI assistant in 2026 instead of waiting even longer for its own large-scale model to mature.
- Siri’s reputation needs rescuing. For years, Siri has been seen as weaker than Google Assistant, Alexa and modern chatbots. A high-impact upgrade is critical if Apple wants to be taken seriously in the AI era.
- Interim fix, not a permanent marriage. The deal is widely described as a temporary solution while Apple races to finish its own next-gen large language model that could eventually replace Google’s tech.
It’s also a symbolic reversal: for a long time, Google paid Apple tens of billions to remain the default search engine on iOS. With this deal, Apple will be paying Google — a smaller number, but for a piece of core technology that sits at the heart of the iPhone experience.
4. Will Google see my Siri data?
This is the question that matters most to privacy-conscious users — and to Apple’s brand.
Based on current reporting, the custom Gemini model for Siri will run on Apple’s own Private Cloud Compute (PCC) servers. In other words: the model comes from Google, but the infrastructure and security environment are controlled by Apple.
- Apple-run servers. Requests routed to Gemini should be processed in data centres Apple manages, not on Google’s infrastructure.
- Data separation. The goal is that Siri traffic powered by Gemini stays out of Google’s logs and datasets, even though it uses Google’s model weights.
- Hybrid privacy model. Highly personal queries may still be handled purely by on-device models where possible, with cloud help only when needed.
However, until Apple releases a full technical privacy whitepaper, there will still be open questions: what exactly is logged, how long it is kept, and what cryptographic protections are used. Expect Apple to emphasise independent security audits and verifiability when it presents the feature publicly.
If you care deeply about privacy, keep an eye out for:
- New privacy and Siri settings that let you control when cloud processing is used.
- Whether there’s a “local only” mode for some tasks, even if that means weaker features.
- Clear explanations of how data is handled for training and improvement of Apple’s own models.
5. Timeline, iOS versions and which devices are likely to get it
Nothing is official until Tim Cook or Craig Federighi says it on stage, but here’s what consistent leaks and reports suggest right now:
- Spring 2026 launch window: The revamped Siri is expected to ship with iOS 26.4, likely around March–April 2026, after earlier AI promises slipped.
- Gradual rollout: Advanced Apple Intelligence features often arrive first in English and a small set of markets, then expand. It would be surprising if Siri’s Gemini features didn’t follow that pattern.
- Newer hardware focus: Features that depend on heavy on-device processing plus cloud calls are likely to be limited to recent iPhone, iPad and Mac models with more powerful neural engines.
Until Apple publishes an official compatibility list, expect the full Gemini-backed Siri experience to target its newest hardware and latest OS versions, with older devices either excluded or given a trimmed-down version.
6. What this means for the AI race — and for you
On paper, this deal offers a short-term win for all sides:
- Apple gets a credible generative-AI story for 2026 without waiting for its own largest models to catch up.
- Google gains a new, high-margin revenue stream and deeper influence inside a rival ecosystem.
- Users should finally get a Siri that feels like a modern AI assistant instead of a throwback to 2016.
But there are obvious strategic tensions:
- Dependence vs. control. Apple has built its reputation on controlling the full stack; relying on Google for a core AI layer highlights the gap in Apple’s cloud-scale AI.
- Competition with Android. Apple will depend on the same company that powers Android’s AI. That raises questions about whether Apple will get feature parity and priority updates.
- Regulatory attention. Apple–Google deals around search are already scrutinised by regulators. A deep AI partnership could invite fresh antitrust questions in the US and EU.
For everyday users, it boils down to this: Does Siri finally stop feeling dumb? If the Gemini-powered upgrade delivers genuinely smarter, more useful behaviour — without compromising privacy — most people won’t care who built the model, as long as their iPhone just feels more capable.
7. What you can do now to get ready
- Keep your devices updated. Running the latest iOS, iPadOS or macOS will almost certainly be a requirement for the new Siri features.
- Consider hardware age. If you’re on a much older iPhone or iPad and care about AI features, you may want to factor this upgrade into your next purchase timing.
- Organise your digital life. Smarter assistants work best when your calendar, contacts, reminders and email are not a mess. Future Siri features will rely heavily on that data.
- Review privacy preferences. Get familiar with Settings > Siri & Search and Settings > Privacy & Security on your devices so you’re ready to fine-tune any new AI toggles Apple introduces.
- Experiment with AI now. Trying Gemini, ChatGPT or similar tools today gives you a baseline for what “good” AI feels like — and a way to judge whether Siri’s upgrade really delivers.
8. Quick FAQ
When will the new Gemini-powered Siri actually arrive?
The best current estimate is spring 2026, alongside iOS 26.4, based on reports citing people familiar with Apple’s plans. Until Apple announces it on stage, treat that as an informed expectation, not a guaranteed ship date.
Will this make Siri as smart as ChatGPT or full Gemini on the web?
Probably not in exactly the same way. Apple’s custom Gemini model is tuned for assistant-style tasks like planning, summarisation and working with apps, not for endless, open-ended chat. Apple may also deliberately constrain some behaviours to keep Siri reliable and on-brand. Expect a big leap over today’s Siri, but not a full ChatGPT clone.
Does this mean Apple gave up on building its own AI?
No. Reporting consistently says that Apple still aims to ship its own trillion-parameter class model and eventually replace Gemini. Licensing Gemini buys Apple time, letting it ship a competitive Siri while its in-house research and infrastructure scale up.
Is Apple going to rebrand Siri?
So far, leaks talk about Linwood and Glenwood as internal project names. There’s no solid reporting that Apple will kill the “Siri” brand. It’s more likely that Siri keeps its name but gets a new engine and more visual, AI-heavy interface.
Will older iPhones miss out?
Almost certainly some will. Apple typically limits its most demanding AI features to newer chips with bigger neural engines. Expect the full experience to focus on recent iPhone, iPad and Mac models, with older hardware either unsupported or offered a reduced set of capabilities.
Further reading & sources
Key reporting behind this explainer:
- Reuters – Apple to use Google’s AI model to run new Siri
- TechCrunch – Apple nears deal to pay Google $1B annually to power new Siri
- 9to5Mac – Apple nears $1 billion Google deal for custom Gemini model to power Siri
- The Verge – Apple is planning to use a custom version of Google Gemini for Apple Intelligence
- The Decoder – Siri will get a Gemini-powered brain transplant
Want a deeper dive into Google’s model itself? Read our explainer on what Google Gemini is and how it works.
For more AI coverage, browse the Technology and AI sections on ReadGlobe.
