Apple Intelligence is finally here, and like most AI on smartphones so far, it's largely disappointing.
Apple Intelligence's early features are all very familiar: there are glowing gradients and sparkling icons that indicate the presence of AI; writing tools that make your emails sound more professional; and an AI eraser in photos that removes distractions. It's all there and everything works well. But none of this comes close to the time-saving computing platform switch we were promised.
Essentially, there are two Apple Intelligences: the one that exists now and the one we may see in the future. Even in today's launch announcement, Apple is busy touting features that haven't rolled out yet. What's here today is a handful of tools that broadly share one theme: they help you weed out distractions and find the signal in the noise. That's the theory, anyway.
Apple uses AI to group notifications so you can catch up on what you missed faster. You can summarize long emails and use a new focus mode that filters out unnecessary distractions. In practice, these features sort of work, but after using them for a week, I don't feel like I've saved much time or energy.
In the Mail app, AI summaries appear where the first line of an email would normally be previewed when you're viewing your inbox. There's also an option to summarize individual emails. Maybe it's a reflection of how useless email has become, but I didn't find any of these features particularly helpful. You know what feature already does a pretty good job of summarizing an email? The subject line. At least that's true for most of the emails I receive; they're usually short and to the point. Maybe Tim Cook saves a lot of time reading long emails, but personally I could live without a little summary of every email the DNC sends me asking for three dollars by midnight.
Notification summaries seem a bit more promising to me – at least it's pretty funny to watch the AI try to summarize a string of gossip texts or a flurry of notifications from your doorbell. But some important information also surfaced in a series of text messages from a friend, and if I hadn't seen it in the summary when I glanced at my phone, I might not have read those messages until much later. That was genuinely helpful.
Over in Photos, you'll find the new Clean Up tool in your editing options. It's designed to quickly remove objects from a scene. You can tap something the tool has automatically highlighted, or circle something you want to remove yourself. It runs on-device, so all you have to do is wait a few moments and watch the selected object (mostly) disappear.
The tool is particularly useful for smaller objects in the background. But it's only about as good as Google's older Magic Eraser tool in Google Photos – occasionally better, but nowhere near as good as Google's Magic Editor, which uses generative AI for incredibly convincing object removal. Magic Editor runs in the cloud, so it's not a perfectly fair comparison, but still. I can use Google Photos' on-device Magic Eraser on my four-year-old iPhone 12 Mini, and the results are pretty close to what I get with Clean Up on the iPhone 16 – not a great argument for the AI phone upgrade cycle.
Of course, there's also an improved Siri. Sure, it looks different, and typing queries is a handy addition, but you don't have to use it long to realize that it's basically the same old Siri with a new coat of paint. It handles natural language better and includes more product knowledge to help you find settings on your iPhone, but that's about it for now. Apple has promised major updates to Siri in the future, and features like ChatGPT integration are expected by the end of the year. But the big things – contextual awareness, the ability to take action in apps – are all slated for 2025.
Other features – like AI-generated photo memories and smart replies – do what they're supposed to do, but lack a certain human touch. I didn't send any of the AI-suggested responses in my messages, even though they conveyed the right sentiments. If I'm going to take the time to respond to a text, I might as well write “This is hard” myself instead of letting the AI do it, you know? Isn't that part of the point of texting someone? I also had Photos create a memory of moments with my child, which it did, but it gave it the eerily impersonal title “Happy moments with child.”
To be clear: criticism of Apple Intelligence is not an endorsement of other phones' AI; right now, they're all unhelpful to varying degrees. If you want to make it look like a helicopter crashed in an empty field, there's certainly AI for that. But if you need help actually getting things done? That part isn't quite there yet.
And to be fair, this is version one, and Apple has made it pretty clear that its more impressive intelligence features will be arriving over the next year. But Apple has also stamped every new iPhone, iPad, and Mac it currently sells with a big, bright “Built for Apple Intelligence” badge, suggesting we'd be sorry to buy an Apple device that can't handle AI. If Apple Intelligence is a disappointment right now, it's because Apple has built it up to impossible heights.
There's more to come, and some of it looks really promising. With this first wave of AI features, Apple is catching up with Google and Samsung. But no phone maker has yet developed a cohesive set of time-saving AI tools. Apple may be late, but the game is just beginning.