Apple Intelligence: AI, features, research delays, and supported devices

June 17, 2025

Apple Intelligence is the name of Apple’s artificial intelligence platform. The company says it “draws on your personal context while setting a brand-new standard for privacy in AI.” It was introduced during the WWDC 2024 keynote, and it’s a central part of Apple’s iPhone, iPad, and Mac devices, starting with iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1. Apple Vision Pro also supports the platform with visionOS 2.4.

With the iOS 26 announcement, the company is promising even more features arriving later this fall.

Release date

Apple Intelligence has been available on Apple devices since October 28, 2024, with iOS 18.1, iPadOS 18.1, and macOS 15.1. Apple Vision Pro users have had access to Apple Intelligence since March 31, 2025. So far, the company still labels these features as beta.

At the WWDC 2025 keynote, Apple announced several other features are coming later this fall.

iOS 26, iPadOS 26, and macOS Tahoe upcoming Apple Intelligence features

Apple Intelligence with iOS 26, iPadOS 26, and macOS Tahoe 26: an iPhone 15 Pro charging. Image source: Apple Inc.

Apple plans to keep improving existing Apple Intelligence features while also making some important additions to the platform. Here’s everything announced so far:

  • Live Translation: In the Messages, Phone, and FaceTime apps, Live Translation lets you communicate with others across different languages by automatically translating text and audio.
  • Onscreen ChatGPT awareness: If ChatGPT powers your Apple Intelligence experience, you can ask ChatGPT questions about what’s on your screen. The AI can also search on Google, Etsy, and other supported apps for you.
  • Genmoji improvements: With iOS 26, it’s possible to mix emoji, Genmoji, and descriptions together to create a new image. It’s also possible to combine two emojis to create a different one.
  • Image Playground improvements: Image Playground can now be powered by ChatGPT. The AI offers five different styles to create your next images.
  • AI-powered Shortcuts app: There are several new Apple Intelligence-powered shortcut suggestions. For example, you can even build a simple AI chatbot in Shortcuts, even though Apple doesn’t offer one itself.
  • Order tracking details: Apple Intelligence can automatically identify and summarize order tracking details from emails merchants have sent you.
  • Visual Intelligence improvements: Apple improves how you add events to your calendar when you point the camera at one. In addition, it’s possible to take a screenshot and ask Visual Intelligence to find the item, or something similar, for you using Google, Etsy, and more.
  • Foundation Models Framework: With iOS 26, Apple is giving developers direct access to its on-device intelligence models in their apps (see the sketch right after this list). These features can work offline, and Apple doesn’t charge developers anything for them.
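
To illustrate the Foundation Models bullet above, here’s a minimal sketch of what calling the on-device model from an app might look like, based on the API Apple previewed at WWDC 2025. Exact names such as LanguageModelSession and respond(to:) may still change while iOS 26 is in beta.

import FoundationModels

// Minimal sketch: ask the on-device Apple Intelligence model for a summary.
// API names follow Apple's WWDC 2025 preview and may change during the beta.
func summarize(_ text: String) async throws -> String {
    // A session represents one conversation with the on-device model.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content // the generated text
}

Because the model runs on device, a call like this can work offline and doesn’t send the text to Apple’s servers.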

Finally, watchOS 26 also gets an Apple Intelligence feature, although a nearby iPhone 15 Pro or newer is required:

  • Workout Buddy: Apple created an AI coach that can encourage you based on your workout data and fitness history. This text-to-speech model “generates personalized pep talks using voice data from an Apple Fitness+ trainer, so it has the right energy, style, and tone for a workout.”

On-screen awareness and the Siri delay

Siri gets an AI-powered upgrade in iOS 18. Image source: Apple Inc.

During Apple’s Q2 earnings call, CEO Tim Cook was questioned about the Siri delays. To investors, Cook said, “We just need more time to complete the work so that they meet our high-quality bar.”

Still, the CEO barely touched on one of the company’s most anticipated features and one of its biggest recent failures. Back in March, Apple announced it would postpone the all-new Siri indefinitely.

Reports following that announcement revealed that Apple didn’t have a working product when it teased the new Siri during the WWDC 2024 keynote. Even months later, the company struggled to get it working correctly.

So instead of gradually adding new features ahead of iOS 18.4, Apple decided to delay the all-new Siri. The company has reportedly reshaped the Siri team, intending to deliver a better experience in the coming years.

Even so, a report revealed that Cupertino still plans to release the delayed Siri by iOS 26.4. According to Bloomberg’s Mark Gurman, Apple plans to debut this AI-powered upgrade for Siri by the end of the iOS 26 cycle, when it believes it will have most of Siri’s “v2” LLM ready.

iOS 18, iPadOS 18, and macOS Sequoia features

Apple Intelligence feature summary. Image source: Apple Inc.

These are some of the Apple Intelligence features previewed during the WWDC 2024 keynote for iPhone, iPad, Mac, and Apple Vision Pro that are now available to users:

  • Writing Tools: Users can rewrite, proofread, and summarize text nearly everywhere they write, including Mail, Notes, Pages, and third-party apps.
  • Image Playground: Users can create playful images in seconds, choosing from Animation, Illustration, or Sketch styles. It’s built right into apps like Messages and is also available as a dedicated app.
  • Memories in Photos: Users can create stories they want to see by typing a description. Apple Intelligence will pick out the best photos and videos based on the description, craft a storyline with chapters based on themes identified from the photos, and arrange them into a movie with its own narrative arc.
  • Clean Up tool: This Photos app feature can identify and remove distracting objects in the background of a photo without accidentally altering the subject.
  • Siri: Users can type to Siri and switch between text and voice, communicating in whatever way feels right for the moment.
  • ChatGPT integration: When you feel Apple Intelligence isn’t enough, you can allow ChatGPT to access Writing Tools and other features for a better response.

iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 start the Apple Intelligence revolution

Apple Intelligence running on M4 iPad Pro

In July 2024, Apple released the first betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1. Exclusive to iPhone 15 Pro and M1 (or newer) devices, these operating system updates shipped at the end of October with these functions:

  • Writing Tools: Proofread your text, rewrite different versions until the tone and wording are right, and summarize the selected text with a tap.
  • Improved Siri: With a new design, Siri can maintain context between requests. Even if you stumble over words or shift what you’re saying mid-sentence, Siri can understand what you actually want.
  • Priority notifications: These appear at the top of the stack, letting you know what to pay attention to at a glance. Notifications are summarized so you can scan them faster.
  • Priority messages in Mail: Elevate time-sensitive messages to the top of your inbox, like an invitation that has a deadline today or a check-in reminder for your flight this afternoon.
  • Record and transcribe calls in the Notes app: Just hit record in the Notes or Phone apps to capture audio recordings and transcripts. Apple Intelligence generates summaries of your transcripts, so you can get to the most important information at a glance.
  • Reduce interruptions: An all-new Focus Mode understands the content of your notifications and shows you the ones that might need immediate attention, like a text about picking up your child from daycare later today.
  • Smart Reply in Mail: Quickly draft an email response with all the right details. Apple Intelligence can identify the question you were asked in an email and offer relevant selections to include in your response.
  • Clean Up: This Photos app feature can identify and remove distracting objects in the background of a photo without accidentally altering the subject.
  • Summarization: Apple Intelligence summarizes more than just Messages and Mail notifications.

iOS 18.2 features

How to use Genmoji in iOS 18.2

iOS 18.2 was released in December 2024. These are some of the Apple Intelligence features:

  • Genmoji support: Create custom emoji from a description; users can type a prompt like “monkey with pink hat” to make an entirely new emoji; here’s how to use Genmoji on your iPhone
  • Image Playground: Users can create playful images in seconds, choosing from Animation, Illustration, or Sketch. This app is built right into apps like Messages and is also available in a dedicated app
  • ChatGPT integration: When you feel Apple Intelligence isn’t enough, you can allow ChatGPT to access Writing Tools and other features for a better response
  • Visual Intelligence: It helps users learn about objects and places faster than ever. Users can click and hold Camera Control to pull up the hours or ratings for a restaurant they pass, add an event from a flyer to their calendar, quickly identify a dog by breed, and more
  • Image Wand: “Rough sketches can be turned into delightful images, and users can even select empty space to create an image using context from the surrounding area” in the Notes app
  • Create Images expansion: Another Apple Intelligence feature available with iOS 18.2 beta 2 is the ability to create an image when you highlight text in the Notes app.

iOS 18.3 features

iOS 18.3 was released in January 2025. This software tweaks the Notification Summaries feature:

  • Easily manage settings for notification summaries from the Lock Screen
  • Updated style for summarized notifications better distinguishes them from other notifications by using italicized text as well as the glyph
  • Notification summaries for News & Entertainment apps are temporarily unavailable, and users who opt in will see them again when the feature becomes available

iOS 18.4 features

Apple Intelligence now summarizes notifications in more languages at the same time

iOS 18.4 was released in March 2025. These are some of the Apple Intelligence features available:

  • New languages: Apple adds Chinese, French, German, Italian, Brazilian Portuguese, Spanish, Japanese, Korean, and localized English for Singapore and India.
  • Image Playground: The long-awaited Sketch style is now available alongside the Animation and Illustration options.
  • Genmoji: Apple tweaked the Genmoji icon on the keyboard, as it now reads “Genmoji.”
  • Mail Categorization: Apple brought Mail Categorization to iPad and Mac users with iPadOS 18.4 and macOS 15.4.
  • Apple Vision Pro support: visionOS 2.4 adds the AI platform to Vision Pro users.
  • Visual Intelligence: Apple expanded the Visual Intelligence feature to the Action Button so iPhone 15 Pro and iPhone 16e users could take advantage of it.

Here’s how to use Apple Intelligence.

Visual Intelligence comes to all AI-powered iPhones

Using the Camera Control button to find out information about a restaurant with Visual Intelligence. Image source: Apple Inc.

During the iPhone 16 event, Apple revealed the new Camera Control feature. It would unlock visual intelligence to help users learn about objects and places faster than ever before. Users could click and hold the Camera Control to pull up the hours or ratings for a restaurant they pass, add an event from a flyer to their calendar, quickly identify a dog by breed, and more.

Camera Control also serves as a gateway into third-party tools with specific domain expertise, such as when users want to search Google to find where they can buy an item or to benefit from ChatGPT’s problem-solving skills. However, with the iPhone 16e announcement, Apple revealed visual intelligence would be available on the Action Button, and that it would also make its way to iPhone 15 Pro.

With iOS 26, Apple will superpower Visual Intelligence to recognize when a user is looking at an event and suggest adding it to their calendar. In addition, users can take screenshots and use Circle to Search-like functionality to search for an object, inspiration, or something they want to buy.

  • How to use iPhone 16’s Visual Intelligence feature
  • iPhone 16e and 15 Pro finally got the one Apple AI feature worth using

Apple Intelligence is available in 10 languages with eight more coming soon

Apple Intelligence expansion

During the iPhone 16 event, Apple announced it would expand Apple Intelligence to more languages in 2025. In addition to English and its variations for Australia, Canada, India, New Zealand, South Africa, the UK, and the US, it’s also available in the following languages:

  • Chinese
  • French
  • German
  • Italian
  • Japanese
  • Korean
  • Portuguese (Brazil)
  • Spanish
  • Vietnamese

By the end of 2025, Apple Intelligence will be available for the following languages:

  • Danish
  • Dutch
  • Norwegian
  • Portuguese (Portugal)
  • Swedish
  • Turkish
  • Chinese (Traditional)
  • Vietnamese

Apple Intelligence in China

Apple’s Deirdre O’Brien followed the iPhone 15 release in China. Image source: Apple Inc.

Following reports that Apple planned to launch Apple Intelligence in China, we now know how the platform will work in the country.

According to Bloomberg, several teams in China and the US are working to adapt the Apple Intelligence platform for the region, with Alibaba powering this experience. With a launch date expected as soon as May, this means a proper update will come a little after Apple starts offering Chinese support for its AI features with iOS 18.4.

Bloomberg explained that Alibaba would “censor and filter AI output to comply with requirements from the Chinese government.” Additionally, Baidu will handle other AI features, such as Visual Intelligence and searches. While previous reports said Baidu wouldn’t be involved with Apple Intelligence, the publication says the Chinese company will also have a role in this launch.

Apple Intelligence has three main layers: on-device features, server-side functions, and OpenAI-powered capabilities. Apple will use the same on-device AI models it uses in the rest of the world, with Alibaba serving as a “layer on top that can censor material that the government objects to.” For server-side features, the company might rely on its GCBD iCloud partner, while Baidu will be responsible for the Visual Intelligence part.

Alibaba recently released the models necessary to power Apple Intelligence. We expect the platform to be available in the country with iOS 18.6.

Tim Cook explains Apple and OpenAI’s ChatGPT partnership

Apple AI: Tim Cook explains in interview

Rumors were true, and Apple has partnered with OpenAI. According to the company, the two systems work together seamlessly, but core features separate them.

With Apple Intelligence, the company ensures that all data stays private through Private Cloud Compute, while OpenAI’s ChatGPT usually collects user data. In an interview with YouTuber Marques Brownlee, Apple CEO Tim Cook explained the core difference between Apple Intelligence and the ChatGPT partnership.

“There’s Private Cloud Computing, and there’s the arrangement with OpenAI,” says Tim Cook. “These two things are different. So, if you look at Private Cloud Compute, we’re utilizing the same basic architecture as the silicon that’s in the iPhone 15. We’re using the same software, and so we believe that we’ve done it in such a way that it’s as safe, secure, and private in the Private Cloud Compute as in the device.”

That means Apple won’t collect users’ data, build a profile of the user, or sell this data elsewhere. Cupertino aims to extend the iPhone’s on-device processing to the cloud with the level of security people are used to on their iPhones.

Tim Cook continues: “So we really, we really worked on this on a lot and put a lot of work behind that arrow to be sure that if you’re working on something that requires world knowledge, so you’re out of the domain of personal context and so forth, then you may want to go and use one of the large language models that are on the market, and we will be selected what we feel is the best one with OpenAI and ChatGPT.”

That said, all personal requests related to Apple’s built-in apps, such as Messages, Mail, Calendar, and more, will use the company’s own intelligence. In contrast, “world knowledge” requests can go to OpenAI’s ChatGPT and, later, to other large language models.

New LLMs can join the party later

While Apple first integrated with OpenAI, the company plans to work with other LLMs as well. For example, Cupertino is in talks with Google about licensing Gemini. It’s unclear when Gemini support is coming, but Aaron Perris posted a screenshot on X that shows Google showing up in a backend update that came with the iOS 18.4 beta.

Apple Intelligence compatible devices

How to fast charge iPhone 15 Pro

During the WWDC 2024 keynote, Apple announced which devices would be compatible with Apple Intelligence:

  • iPhone 15 Pro models or newer
  • M1 iPad models or newer (such as the M4 iPad Pro) and the new iPad mini (A17 Pro)
  • Apple Silicon Macs running macOS Sequoia
  • Apple Vision Pro

Apple papers suggest where its AI efforts are at


AI model for instruction-based image editing

In February, Apple released a revolutionary AI model for instruction-based image editing. According to a paper published by Apple researchers, instruction-based image editing improves the controllability and flexibility of image manipulation via natural commands without elaborate descriptions or regional masks. The study shows “promising capabilities in cross-modal understanding and visual-aware response generation via LM,” as they investigated how MLLMs facilitate edit instructions and MLLM-guided image editing.

Apple’s image editing AI model can produce concise and clear instructions for the editing process, create Photoshop-style modifications, optimize photo quality, and edit specific elements of a picture, such as faces, eyes, hair, clothes, and accessories.

MM1: Apple’s AI model

In March, Apple researchers published a paper highlighting how they’re training a new large language model (LLM).

Called MM1, this LLM can integrate text and visual information simultaneously. The paper offers an interesting look at the importance of various architectural components and data choices. The researchers say they were able to “demonstrate that for large-scale multimodal pre-training using a careful mix of image-caption, interleaved image-text, and text-only data is crucial for achieving state-of-the-art (SOTA) few-shot results across multiple benchmarks, compared to other published pre-training results.”

In addition, they showed that “the image encoder together with image resolution and the image token count has a substantial impact, while the vision-language connector design is of comparatively negligible importance.”

Apple’s MM1 AI model uses a family of multimodal models with up to 30 billion parameters, consisting of both dense models and mixture-of-experts (MoE) variants, that are state-of-the-art in pre-training metrics and achieve competitive performance after supervised fine-tuning on a range of established multimodal benchmarks.

ReALM could be better than OpenAI’s GPT-4

iOS 18.1 Apple Intelligence on iPhone 15 Pro: the all-new Siri design. Image source: José Adorno for BGR

Apple researchers have published a paper about a new AI model. According to the company, ReALM is a language model that can understand and successfully handle contexts of different kinds. With that, users can ask about something on the screen or run in the background, and the language model can still understand the context and give the proper answer.

This is the third AI paper Apple has published in the past few months. These studies tease the upcoming AI features of iOS 18, macOS 15, and Apple’s newest operating systems. In the paper, Apple researchers say, “Reference resolution is an important problem, one that is essential to understand and successfully handle context of different kinds.”

One example is a user asking for nearby pharmacies. After a list is presented, something Siri can already do, the user could ask, “Call the one on Rainbow Rd.,” “Call the bottom one,” or “Call this number (present on-screen).” Siri can’t perform this second step today, but with ReALM, the language model could understand the context by analyzing on-device data and complete the query.
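
To make the idea concrete, here’s a small, hypothetical Swift sketch (not Apple’s code) of the reference-resolution approach the paper describes: on-screen entities are flattened into numbered text so a language model can map a request like “Call the bottom one” to a concrete item. The ScreenEntity type and referencePrompt function are illustrative names only.

struct ScreenEntity {
    let label: String   // e.g. a pharmacy's name
    let value: String   // e.g. its phone number
}

// Builds a text prompt that pairs the user's request with a top-to-bottom
// listing of what is on screen, so the model can answer with an item number.
func referencePrompt(request: String, entities: [ScreenEntity]) -> String {
    let listing = entities.enumerated()
        .map { "\($0.offset + 1). \($0.element.label): \($0.element.value)" }
        .joined(separator: "\n")
    return """
    On-screen items, listed top to bottom:
    \(listing)

    User request: "\(request)"
    Reply with the number of the item the user means.
    """
}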

Ferret LLM

This paper explains how a multimodal large language model can understand the user interfaces of mobile displays. The researchers say MLLMs have advanced considerably but still “fall short in their ability to comprehend and interact effectively with user interface (UI) screens.”

This assistant is still far from being released, but once Apple masters it, it could be integrated alongside the ReALM model.

BGR will update this guide as we learn more about Apple’s AI efforts.

The post Apple Intelligence: AI, features, research delays, and supported devices appeared first on BGR.
