For years, I’ve struggled with reaching for my phone as soon as I wake up and scrolling through the notifications I missed overnight. Lately, this bad early-morning habit has become a quick glance at the screen, thanks to my iPhone’s new notification summaries.
A few months ago, I switched to an Android phone, but when Apple rolled out its Apple Intelligence features earlier this year, I was tempted to try them.
Since I couldn’t run them on my older iPhone 15, I borrowed my partner’s iPhone 16 Plus, which comes with an upgraded Neural Engine designed to run the latest generative AI models. I had to manually activate Apple Intelligence on iOS 18.1, whereas iOS 18.3 enables it by default.
Apple has rolled out a series of new AI-powered features starting with the iOS 18.1 update, and even for someone like me who rarely talks to Siri, many of these have proved game-changing for my iPhone experience and helped me reduce my screentime.
Message summaries powered by AI make group chats more manageable
When there are multiple unread alerts from an app, like email or texts, the new update now condenses them into a brief two-line summary. It shows automatically over a stack of an app’s notifications right on the lock screen.
The iPhone’s software now works like a generative AI chatbot: it has a deeper understanding of language and can scan notifications for key details much like a human would.
When I missed a few texts from a particularly active group that was in the middle of planning a vacation, Apple Intelligence’s summary showed me destination suggestions from members and preferred dates, such as, “Janhavi recommends Bali. Most people are comfortable with the end of May.”
Another series of messages from a friend was summarized as “Panic at home and work; coworker fixed an error.” Notification summaries work across apps. Over the last couple of weeks, I’ve relied on them to quickly understand whether a pile of emails, texts, or Slack notifications needs my immediate attention.
There are times when a notification summary is too short or generic to be useful. It can sometimes miss the context. For example, it incorrectly summarized a handful of newspaper headlines and wrote “Netanyahu arrested” for a report about the International Criminal Court issuing an arrest warrant for the Israeli prime minister.
Since then, Apple has paused notification summaries for news and entertainment apps. You can also manually pick which apps’ alerts are summarized or switch off the function altogether.
A new setting mutes non-urgent notifications
Another Apple Intelligence tool that has reduced my screentime is the “Reduce Interruptions Focus” setting. Apple’s new software, the inner workings of which we don’t know much about, analyzes each alert’s urgency based on its content and only shows the ones its AI models think would matter to you while hiding the rest.
Once I place my iPhone in this mode, it typically hides non-essential correspondence, like sales alerts, forwards on messaging apps, or reservation confirmations, and notifies me about others with labels like “Maybe Important.” For example, it didn’t ping me about a news link someone texted me but did alert me about another message from the same person confirming dinner plans.
Siri is more accurate
With the Apple Intelligence updates, Siri now converses naturally and is more contextually aware.
If you ask a follow-up question after the initial “Hey Siri” request, you no longer have to repeat the details. If you stutter mid-conversation or converse as you would normally, it won’t lose track of your question. A request like “Siri, set a timer for 10 minutes, wait, sorry, make that 15” results in a 15-minute timer being set — not the classic “I don’t quite understand” reply.
With Apple Intelligence, Siri is also far more accurate at performing actions inside other apps, like texting someone on WhatsApp. Siri has long been notorious for misunderstanding voice commands, and this update irons out those basic issues. Apple has said that in future updates, Siri will let you execute more complex workflows hands-free on third-party services, paving the way for a smart assistant that can control your entire phone.
Siri + ChatGPT = happy Apple user
The Siri upgrade that I use the most is its ability to consult ChatGPT. When Siri doesn’t have an answer, it forwards the query to OpenAI’s chatbot, which, in most cases, does. The ChatGPT addition is especially effective for analyzing a visual on your phone.
I can ask Siri, “What’s on my screen?” and it will prompt ChatGPT to describe it. Pressing and holding the new Camera Control button opens a Visual Intelligence interface, where I can take a picture and ask ChatGPT questions about it or look up its contents on Google.
In the last week, I’ve taken pictures of books I’ve completed and used Visual Intelligence to find new, similar ones or to prepare a dinner menu to pair with a bottle of wine I was gifted.
Similar functions have been available on third-party apps, but having them built into iOS makes them much more convenient.
More importantly, Apple’s solutions keep your sensitive data private. You can not only use the ChatGPT integration without an account, but Apple also explicitly prevents OpenAI from training its models on your conversations and media.
It’s not perfect yet. It often takes longer to get a response from Siri than it would to launch the ChatGPT app directly. Plus, there’s no live conversation mode like the ones Google Gemini and Perplexity offer, which means that after each query, I have to activate Apple’s voice assistant again.
Practical AI tools won me over
That said, I appreciate that Apple’s generative AI update focuses on granular, day-to-day practicality.
I’ve outlined the most useful ones, but other Apple Intelligence tools, like Siri’s ability to record and transcribe calls and the Photos app’s ability to generate a movie from your media based on a brief instruction, have also impressed me.
But though Apple Intelligence is already driving iPhone sales, it’s still not enough for me to switch back, largely because of the hardware. In the last couple of months, I’ve found it more productive to use AI assistants on larger screens, like the foldable ones many Android phone makers offer.