With the launch of developer betas for [iOS 26.1](https://9to5mac.com/2025/09/22/ios-26-1-beta-1/), [iPadOS 26.1](https://9to5mac.com/2025/09/22/ipados-26-1-beta-1-is-now-available/), and [macOS Tahoe 26.1](https://9to5mac.com/2025/09/22/apple-releases-macos-tahoe-26-1-developer-beta-1/), Apple has started laying the groundwork to support Anthropic's protocol for agentic AI. Here's what that means.
## What’s MCP again?
MCP, or [Model Context Protocol](https://www.anthropic.com/news/model-context-protocol), was introduced last November by Anthropic and quickly became the industry-standard way to connect AI systems to outside tools, apps, and data sources. In essence, it aims to be for AI what HTTP is for the web or SMTP is for email.
According to Anthropic:
> As AI assistants gain mainstream adoption, the industry has invested heavily in model capabilities, achieving rapid advances in reasoning and quality. Yet even the most sophisticated models are constrained by their isolation from data—trapped behind information silos and legacy systems. Every new data source requires its own custom implementation, making truly connected systems difficult to scale.
>
> MCP addresses this challenge. It provides a universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol. The result is a simpler, more reliable way to give AI systems access to the data they need.
Since its announcement, MCP has been embraced by various companies and platforms, including Zapier, Notion, Google, Figma, OpenAI, Salesforce, and many others.
The result is a consistent way for AI assistants to connect to APIs and data sources, so they can act autonomously on a user's request or prompt.
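To make that "single protocol" idea a bit more concrete, MCP runs over JSON-RPC 2.0: a client asks a server which tools it offers (`tools/list`), then asks it to run one (`tools/call`). Here's a minimal Swift sketch of what such a call looks like on the wire; the `create_event` tool, its arguments, and the calendar scenario are hypothetical stand-ins for whatever a real MCP server would actually expose.

```swift
import Foundation

// A rough sketch of the JSON-RPC 2.0 message shape MCP is built on.
// The "create_event" tool and its arguments are hypothetical; real tool names
// and schemas come from whichever MCP server the client is talking to.
struct MCPToolCall: Encodable {
    let jsonrpc = "2.0"
    let id: Int
    let method = "tools/call"
    let params: Params

    struct Params: Encodable {
        let name: String                // which tool the agent wants to run
        let arguments: [String: String] // tool-specific input, checked against the server's declared schema
    }
}

// An AI client asking an MCP server (say, a calendar integration) to run a
// tool the server previously advertised in response to "tools/list".
let request = MCPToolCall(
    id: 1,
    params: .init(
        name: "create_event",
        arguments: ["title": "Team sync", "start": "2025-09-23T10:00:00Z"]
    )
)

let encoder = JSONEncoder()
encoder.outputFormatting = .prettyPrinted
let data = try! encoder.encode(request)
print(String(data: data, encoding: .utf8)!)
```

Because every integration speaks this same shape, an assistant doesn't need bespoke glue code for each service it touches.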
Here are some examples from [ModelContextProtocol.io](https://modelcontextprotocol.io/docs/getting-started/intro) of what MCP can enable:
– Agents can access your Google Calendar and Notion, functioning as a more tailored AI assistant.
– Claude Code can create an entire web application from a Figma design.
– Enterprise chatbots can link to multiple databases within an organization, enabling users to analyze data through chat.
– AI models can design 3D models in Blender and print them using a 3D printer.
## But what about the Apple stuff?
Based on code revealed in today’s betas, we can confirm that Apple is setting the stage to introduce MCP support to [App Intents](https://developer.apple.com/documentation/appintents).
If you're not familiar with App Intents, it's the framework that lets apps expose their actions and content to the system.
Here’s Apple:
> The App Intents framework allows your app’s actions and content to be deeply integrated with system experiences across various platforms, including Siri, Spotlight, widgets, controls, and more. With Apple Intelligence and improvements to App Intents, Siri will suggest your app’s actions to help users discover your app’s features and gain the ability to perform actions within and across apps.
>
> By implementing the App Intents framework, you enable users to personalize their devices by instantly utilizing your app’s functionality with:
>
> – Interactions with Siri, including those that utilize personal context awareness and action capabilities of Apple Intelligence.
> – Suggestions and search via Spotlight.
> – Actions and automations in the Shortcuts app.
> – Hardware interactions that trigger app actions, such as the Action button and squeeze gestures on Apple Pencil.
> – Focus mode to help users minimize distractions.
In other words, today's code indicates that Apple plans to let developers expose their apps' actions and capabilities to AI platforms and agents through a system-level MCP integration.
In practice, this means that down the road, ChatGPT, Claude, or any other MCP-compatible AI model could interact directly with Mac, iPhone, and iPad apps and autonomously perform actions inside them, without developers having to do the heavy lifting of implementing full MCP support on their own.
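For a sense of what developers already ship today, here's a minimal App Intent of the kind a system-level MCP bridge could presumably surface to agents. To be clear, this is plain App Intents code, not anything from today's betas, and the reminder action with its `ReminderStore` helper is hypothetical.

```swift
import AppIntents
import Foundation

// A minimal App Intent exposing one app action to the system. The "Add Reminder"
// action and the ReminderStore type are hypothetical; only the AppIntent plumbing
// reflects the actual framework.
struct AddReminderIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Reminder"
    static var description = IntentDescription("Creates a reminder with the given title.")

    @Parameter(title: "Title")
    var reminderTitle: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Hypothetical app-side storage; a real app would call into its own model layer.
        let id = ReminderStore.shared.add(title: reminderTitle)
        return .result(value: id)
    }
}

// Hypothetical stand-in for the app's data layer.
final class ReminderStore {
    static let shared = ReminderStore()
    private var reminders: [String: String] = [:]

    func add(title: String) -> String {
        let id = UUID().uuidString
        reminders[id] = title
        return id
    }
}
```

Actions declared this way are already what Siri, Spotlight, and Shortcuts see, so it would be a natural layer for Apple to map onto MCP tools.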
It's worth noting that the code in today's betas points to early-stage MCP support, so it may still be a while before this integration ships, or is even officially announced. Still, based on what's already there, this looks promising, and it's great news for bringing agentic AI to the Mac, iPhone, and iPad, both for users and for developers ready to make that jump.
Have you been using MCP-enabled apps and platforms? Let us know in the comments.