
Apple Intelligence is playing catch-up: here’s what you need to know

Apple's take on AI aims to fundamentally change the way you interact with its products

Apple's Craig Federighi presents the Image Playground app running on macOS Sequoia at the company's Worldwide Developers Conference (WWDC) in June 2024.
Apple

With so many AI companions on the market, it was only a matter of time before Apple built its own. Apple Intelligence is Apple’s take on AI, and it aims to fundamentally change the way users interact with the company’s products. Apple’s goal is to incorporate machine learning and advanced AI capabilities into everyday devices.

While that is the plan, Apple has unfortunately fallen far behind in the AI race. Alternatives like Google’s Gemini and ChatGPT are already so far ahead that I am considering ditching my iPhone for a Google Pixel.


Promising more conversational prose from Siri, automated proofreading and text summarization across apps, and lightning-fast image generation, Apple’s AI ecosystem is designed to enhance user experiences and streamline operations across its product lineup. Here’s everything you need to know about Apple’s supposed transformational AI.

Apple Intelligence release date and compatibility

Apple Intelligence was originally due to release in September 2024 alongside the rollout of iOS 18, iPadOS 18, and macOS Sequoia. However, it was delayed by a month, with a phased release finally beginning at the end of October 2024, starting with U.S. English users. More languages and regions became available throughout 2024 and 2025.

You’re only able to use Apple Intelligence on the following devices:

  • iPhone 17 (including Pro and Pro Max)
  • iPhone Air
  • iPhone 16 (including 16e, Plus, Pro and Pro Max)
  • iPhone 15 Pro and Pro Max
  • iPad Pro (M1 and later)
  • iPad Air (M1 and later)
  • iPad Mini (A17 Pro)
  • MacBook Air (M1 and later)
  • MacBook Pro (M1 and later)
  • iMac (M1 and later)
  • Mac mini (M1 and later)
  • Mac Studio (M1 Max and later)
  • Mac Pro (M2 Ultra)
  • Apple Vision Pro (M2)

Apple Intelligence is still receiving ongoing updates, with some regions yet to get the rollout. These updates will continue through 2025.

Apple Intelligence features

No matter what device you’re using Apple Intelligence with, the AI focuses primarily on three functions: writing assistance, image creation and editing, and enhancing Siri’s cognitive capabilities.

Apple Intelligence is designed to span the breadth of the company’s product line. As such, virtually every feature found in the macOS version of Apple Intelligence is mirrored in the iOS and iPadOS versions. That includes Writing Tools, Image Playground, Memories in Photos, and Siri’s improvements.

In addition, iPadOS, when paired with Apple Pencil, unlocks more features. Smart Script in the Notes app, for example, straightens and smooths handwritten text in real time. The new Math Notes calculator will automatically solve written equations in the user’s own handwriting and generate interactive graphs based on those equations with a single tap.

We at Digital Trends took an early version of Apple Intelligence for a spin using the macOS Sequoia beta, but came away rather disappointed with what we saw from the digital agent, a sentiment mirrored by many Apple Intelligence users. For one, only a fraction of the AI tools were actually available in the beta release. And the tools we did have access to, including the writing assistant, Siri, and audio transcription, proved buggy and unreliable.

By the time iOS 18.1 was released, Apple had thankfully addressed many of those issues, putting Apple Intelligence on par with more established AI assistants, like Google’s Gemini.

Writing Tools

The new Writing Tools feature can proofread the user’s writing and rewrite sections as necessary, as well as summarize text across Apple’s application ecosystem including Mail, Notes, and Pages. Third-party developers will be able to leverage Writing Tools in their own apps via API calls.

For example, within the Mail app, Apple Intelligence will provide the user with short summaries of the contents of their inbox, rather than showing them the first couple of lines of the email itself (though if you aren’t a fan of that feature, it’s easy to disable). Smart Reply will suggest responses based on the contents of the message and ensure that the reply addresses all of the questions posed in the original email. The app even moves more timely and pertinent correspondence to the top of the inbox via Priority Messages.

The Notes app has received significant improvements as well. With Apple Intelligence, Notes offers audio transcription and summarization features, as well as an integrated calculator, dubbed Math Notes, that solves equations typed into the body of the note.

Image Playground

Image creation and editing functions are handled by the new Image Playground app, wherein users can spin up generated pictures within seconds and in one of three artistic styles: Animation, Illustration, and Sketch. Image Playground is a standalone app, although many of its features and functions have been integrated with other Apple apps like Messages.

Apple Intelligence is also improving your device’s camera roll. The Memories function in the Photos app was already capable of automatically identifying the most significant people, places, and pets in a user’s life, then curating that set of images into a coherent collection set to music. With Apple Intelligence, Memories is getting even better.

The AI can select photos and videos that best match the user’s input prompt (“best friends road trip to LA 2024,” for example), then generate a story line — including chapters based on themes the AI finds in the selected images — and assemble the whole thing into a short film. Photos users also now have access to Clean Up, a tool akin to Google’s Magic Eraser and Samsung’s Object Eraser, and improved Search functions.

Siri

Perhaps the biggest beneficiary of Apple Intelligence’s new capabilities is Siri. Apple’s long-suffering digital assistant has been more deeply integrated into the operating system, with more conversational speech and improved natural language processing. You’ll have to manually enable the feature on your iPhone before you can use it, but doing so is a simple task.

Siri can defer to ChatGPT for more complex queries, which we cover in more detail below.

What’s more, Siri’s memory now persists, allowing the agent to remember details from previous conversations, while the user can seamlessly switch between spoken and written prompts. Apple is reportedly working on an even more capable version of Siri, but its release may not come until 2026.

Apple Intelligence privacy

With other AI competitors suffering repeated data leaks, causing mistrust among developers and users alike, Apple made sure to prioritize privacy when designing Apple Intelligence. To that end, the company built its own private and secure AI compute cloud, Private Cloud Compute (PCC), to handle complex user queries.

Most of Apple Intelligence’s routine operations are handled on-device, using the company’s most recent A17 and M-family processors. “It’s aware of your personal data, without collecting your personal data,” Craig Federighi, Apple’s senior vice president of Software Engineering, stated at WWDC 2024.

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi continued. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.” This should drastically reduce the chances of private user data being hacked, intercepted, spied upon, and otherwise snooped while in transit between the device and PCC.
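The routing Federighi describes can be sketched in a few lines of Python. This is purely illustrative: the `route` function and request shape below are hypothetical, and Apple has not published its actual decision logic.

```python
from dataclasses import dataclass


@dataclass
class Request:
    task: str
    personal_context: dict           # everything the device knows about you
    relevant_keys: list              # the subset this task actually needs
    needs_large_model: bool = False  # True when on-device models fall short


def route(request: Request):
    """Illustrative sketch of on-device vs. Private Cloud Compute routing."""
    if not request.needs_large_model:
        # Routine operations run entirely on A17/M-series silicon.
        return ("on-device", {})
    # Heavier requests go to PCC, carrying only the task-relevant data.
    payload = {k: request.personal_context[k] for k in request.relevant_keys}
    return ("private-cloud-compute", payload)
```

The key point the sketch captures is the minimization step: the full `personal_context` never leaves the device; only the slice named by `relevant_keys` is sent along with the request.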

“Your data is never stored or made accessible to Apple,” he explained. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.” The company is so confident in its cloud security that it is offering up to a million dollars to anyone able to actually hack it.

Apple Intelligence will defer to ChatGPT on complex queries

ChatGPT functionality, including text generation and image analysis, is integrated into Siri and Writing Tools. ChatGPT can also step in when Siri’s onboard capabilities aren’t sufficient for the user’s query, in which case the request is sent to OpenAI’s public compute cloud rather than Apple’s PCC.

Users won’t have to leave the Siri screen when utilizing ChatGPT’s capabilities. OpenAI’s chatbot functions in the background when called upon, and Siri will state the answer regardless of which AI handles the query. To ensure at least a semblance of privacy protection, the device will display a confirmation prompt before transmitting the user’s request, as well as any documents or images they have attached.
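That hand-off-with-consent flow amounts to a simple fallback chain, sketched below. The function and callback names are hypothetical, chosen for illustration; this is not Apple’s implementation.

```python
def handle_query(query, siri_answer, ask_chatgpt, confirm):
    """Illustrative fallback chain: Siri first, ChatGPT only with consent."""
    answer = siri_answer(query)
    if answer is not None:
        return answer                 # handled on Apple's side (device or PCC)
    # Siri can't answer; ask the user before anything is sent to OpenAI.
    if not confirm(f"Use ChatGPT to answer: {query!r}?"):
        return "Request was not sent."
    return ask_chatgpt(query)         # goes to OpenAI's cloud, not Apple's PCC
```

The design choice worth noting is that the consent check sits between the two backends, so declining the prompt means the query never leaves Apple’s side at all.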


ChatGPT is accessible directly from the device’s user interface (whether it’s running iOS, iPadOS, or macOS), and users have the option of either logging into their ChatGPT account or using it anonymously.

You’ll also be able to invoke ChatGPT directly simply by telling Siri to have ChatGPT handle the task (e.g., “Siri, have ChatGPT assemble a holiday music playlist”).

Apple Intelligence trained on Google’s Tensor Processing Units

A research paper from Apple, published in July 2024, reveals that the company opted to train key components of the Apple Intelligence model on Google’s Tensor Processing Units (TPUs) instead of Nvidia’s highly sought-after GPU-based systems. According to the research team, TPUs provided the computational power needed to train its enormous LLM, and did so more energy-efficiently than a standalone system could have.

This marks a significant departure from how business is typically done in AI training. Nvidia currently commands an estimated 70% to 95% of the AI chip market, so to have Apple opt instead for the product of its direct rival — and reveal that fact publicly — is highly unusual, to say the least. It could also be a sign of things to come. Nvidia’s market dominance couldn’t last forever — we’re already seeing today’s hyperscalers making moves into proprietary chip production.

Beyond Google’s ongoing TPU efforts, Amazon has announced that it’s working on its own chip line, one it claims will outperform Nvidia’s current offerings by 50% while consuming half as much power.

Andrew Tarantola
Former Computing Writer
Andrew Tarantola is a journalist with more than a decade reporting on emerging technologies ranging from robotics and machine…