We’ve talked for months about how Apple will put its own spin on AI features for the iPhone via iOS 18. More recently, we learned that all the new AI features Apple plans to unveil at WWDC 2024 will be branded as “Apple Intelligence.”
That’s a brilliant, and somewhat lucky, way to turn “AI” into an Apple-branded product, and it’s something Apple’s rivals can’t easily match. However, most people probably won’t say “Apple Intelligence” when discussing the iPhone’s AI capabilities. “AI” will continue to mean artificial intelligence for all of us.
That said, what’s clear from all these reports is that Apple is taking AI very seriously and is determined to make the most of it in the coming years. That includes developing some futuristic new products with Apple Intelligence at the core, and I’m already excited about three of them in particular.
Bloomberg’s Mark Gurman detailed some of Apple’s “potentially game-changing hardware” in his Power On newsletter.
The list includes robots that can follow you around the home to help with chores and even humanoid robots that might feature an “advanced AI engine at its core.” Moreover, Gurman mentioned a tabletop robot powered by the smarter version of Siri that’s coming with iOS 18. “That device will be primarily controlled by voice — and having a more precise and sophisticated version of Siri will be invaluable,” Gurman wrote.
As exciting as robotics might sound, I’m actually more interested in three other Apple products that Gurman mentioned.
AR glasses
I’m a big Vision Pro fan, though my excitement has tempered somewhat, and I’m not so sure about buying the device when it eventually makes it to Europe. But I do think the Vision Pro is the gadget we need for Apple’s AR glasses to become real. I’ve also repeatedly said that the spatial computer absolutely needs AI features, including advanced voice capabilities.
Gurman says in his report that the Vision Pro already uses AI “to handle some capabilities, like understanding its external surroundings. And those features could help pave the way for AR glasses or camera-equipped AirPods.”
I’ll remind you of something Apple has often said and will probably continue saying: AI isn’t just about having ChatGPT built into iOS 18 or the iPhone getting all sorts of generative AI features. Machine learning is also AI, and it already powers various features inside Apple’s products. But AR glasses will also need genAI features and the smarter Siri.
AR glasses might one day replace the iPhone. Before we get there, they could be an iPhone accessory, and AI is a key development for making that happen.
AirPods with cameras
As you can see above, Gurman also mentioned AirPods with cameras in the report. The AI must hear and see the world around you to provide helpful assistance. That’s why ChatGPT’s GPT-4o upgrade is so exciting. The new model can interpret voice, images, and videos in addition to text.
Putting cameras in the AirPods would give the AI eyes before the AR glasses arrive. You wouldn’t need to take out your iPhone or put on a separate wearable (like the Humane Ai Pin).
Interestingly, reports say that OpenAI’s Sam Altman might be working on earphones with cameras that would give ChatGPT a hardware home. I wondered recently whether earphones with cameras are going to be the iPhone of AI that Altman is reportedly developing with Jony Ive, especially because Apple is also supposedly creating a similar product.
Apple Watch with blood sugar sensor
Apple has been working for years on Apple Watch tech that would let the wearable perform blood sugar readings non-invasively. These rumors predate the arrival of ChatGPT-style AI by several years. Samsung is also working on similar tech for its wearables.
As a longtime Apple Watch user, I’d find the ability to measure blood sugar continuously a game changer. It’s the most important health feature I want from future wearables. And Gurman explains that we’ll need AI for it to happen:
The new health features, like the coach and noninvasive glucose sensing, also would likely rely heavily on AI processing.
Apple needs sensors to pick up information from the bloodstream beneath the user’s skin, and advanced machine learning algorithms (or AI) would interpret that data. Gurman also notes that Apple has already achieved key milestones in its project to bring blood sugar monitoring to the Apple Watch, but it’s unclear how long it will take until Apple can deploy such a device.
The Apple Watch can currently help you monitor blood sugar, but only if paired with an accessory. Dexcom’s G7 is one such device that beams blood glucose readings directly to the Apple Watch, not just the iPhone.