Sophisticated genAI products need more than voice features like ChatGPT’s Advanced Voice Mode to become helpful assistants. They need more than access to your computer screen, too. The AI also needs its own “eyes” so it can quickly look at what’s around you when you ask it for information.
You can do that right now by taking photos and uploading them to ChatGPT. On Apple’s side, the iPhone 16 will soon get Visual Intelligence, a feature that lets you quickly point the phone’s camera at something and ask the AI questions about it. Then there’s Google’s Lens technology, now infused with Gemini, which can answer questions about the world around you. But the best approach currently available to users is probably the Ray-Ban Meta glasses, which feature built-in cameras so Meta AI can see the world around you.
While I’m not going to switch to Meta AI anytime soon to use these AI-centric smart glasses, I’d love for Apple to come up with a similar concept. Maybe it would be a precursor to the more sophisticated AR glasses that Apple will likely develop in the more distant future to replace the iPhone.
A new report says Apple is already exploring the possibility via internal focus groups that are part of an initiative called Project Atlas.
According to Bloomberg, Project Atlas involves enlisting Apple employees to provide feedback on smart glasses that are currently on the market.
“Testing and developing products that all can come to love is very important to what we do at Apple,” reads a purported internal email from Apple’s Product Systems Quality team. “This is why we are looking for participants to join us in an upcoming user study with current market smart glasses.”
Apple reportedly hosts such internal focus groups whenever it considers entering a new product category. This strategy avoids leaks that would likely result from conducting external studies.
As a longtime iPhone user and genAI fan, I would welcome a pair of Apple smart glasses that work like the Ray-Ban Meta glasses.
A few days ago, I unexpectedly got access to ChatGPT’s Advanced Voice Mode and put the feature to the test almost immediately during a museum visit. At the time, I said it wasn’t a complete success, due mostly to external factors, but I still viewed it as a great experience. Once I turned off Advanced Voice Mode, I ended up uploading photos to ChatGPT to ask the AI questions.
I would have loved for ChatGPT to have access to the phone’s camera in real time while the voice mode was enabled. That would have given me a personal tour guide ready to answer any question I could think of.
Smart glasses made for Apple Intelligence would certainly work well in that scenario. And I’d use them everywhere, not just in museums. I might have questions about things I see in everyday life, and the AI could help.
Apple’s Atlas project is exciting for that reason. I will say the glasses will also need Apple Intelligence capabilities that can match ChatGPT’s. Siri would have to grow chatbot abilities and handle voice as well as GPT-4o does. It’ll be a few more years until that happens, which could be enough time for Apple smart glasses to reach stores. That’s assuming Apple moves forward with the product.
A previous report from Bloomberg also mentioned Apple’s interest in smart glasses. The report said such a device might be released as soon as 2027.
The same report also noted that Apple is considering making AirPods with cameras. That’s another type of product that would benefit Apple Intelligence by giving it eyes, though I think I’d prefer glasses over earbuds with cameras.
In this week’s report about Atlas, Mark Gurman speculates that the smart glasses might actually turn out to be a version of AirPods featuring better battery life, more sensors, and improved audio technology. I’ll say those features would be mandatory for getting Advanced Voice Mode-like experiences from Apple Intelligence.
I’d also want the glasses form factor to come from the company that made the Vision Pro spatial computer. No matter what Meta says about its own VR products, there’s nothing like the Vision Pro on the market.
I’d love to incorporate the Vision Pro into my work environment. After trying it, I was incredibly impressed with what it can do, at least on the entertainment side of things, but I also recognized its drawbacks.
However, some of the tech Apple developed for the Vision Pro could be used in these AI-centric smart glasses. Those same technologies would eventually make full AR glasses possible, which would likely support AI assistants and Vision Pro-like computing.
Earlier this week, Gurman also said that Apple is considering cheaper Vision Pro headsets, including a version that can offload processing to the iPhone. The smart glasses Apple is considering would probably work similarly, passing data via the iPhone to Apple’s servers for Apple Intelligence processing.
All of that is wishful thinking at this point. But there’s no denying that the AI needs eyes to become a better assistant, and I’d rather those eyes come from a company I trust to protect what they see.
Finally, I’ll remind you of two other important developments in AI-related hardware. First, when Google demoed its Gemini voice features at I/O 2024, it gave the AI eyes via a Pixel phone, and also via an unreleased pair of smart glasses. Google, Samsung, and Qualcomm are working together on some sort of head-worn device, which has yet to see the light of day.
Separately, Jony Ive confirmed that he’s working with OpenAI on ChatGPT hardware without detailing the form factor.
I’m not saying that Google & Co. and OpenAI will release smart glasses of their own. But they must have these types of products on their radar.