That risk has taken on a new dimension: it no longer depends only on what you show, but also on what technology can infer. The images you post can leak valuable information, such as where they were taken. And with the arrival of AI models that can analyze and reason about photos, the stakes are even higher.
OpenAI’s new models can tell where you took a photo. o3 and o4-mini have taken visual reasoning to a new level: they can analyze images with impressive accuracy and combine that ability with web search and image-editing tools to refine their answers.
This allows them to explain things better than an instruction manual, or help you understand a complex blueprint. But it also opens the door to uses that should give us pause.
New viral trend. One of the latest trends on platforms like X has nothing to do with creating Ghibli-style images or Lego-style compositions. Now, many people are using these models to identify the exact location where a photo was taken, even when the file carries no location metadata (EXIF GPS data).
Just ask the model to play GeoGuessr, and it will start analyzing the image: cropping details, searching for visual matches, and drawing conclusions. In one of our tests, the system identified a specific street in Chicago from a simple screenshot.
This is a feature that should make you think. In today’s hyper-connected environment, where photos are constantly being shared, it’s worth remembering that you don’t need to explicitly geolocate an image for someone else to figure out where it was taken.
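Since the trend works even on photos without location metadata, it is worth knowing whether your images carry explicit coordinates in the first place. As a minimal sketch (stdlib only; the function names are my own, and the parser handles only a well-formed baseline JPEG, not every EXIF variant), the snippet below scans a JPEG for an APP1 EXIF segment and checks IFD0 for the GPSInfo pointer tag (0x8825), which is where GPS coordinates hang in the EXIF structure:

```python
import struct

GPS_TAG = 0x8825  # GPSInfo IFD pointer tag in EXIF IFD0

def find_exif_segment(jpeg_bytes):
    """Return the TIFF payload of a JPEG's APP1 EXIF segment, or None."""
    if jpeg_bytes[:2] != b"\xff\xd8":  # SOI marker
        return None
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            return None  # malformed stream
        marker = jpeg_bytes[i + 1]
        if marker == 0xD9 or 0xD0 <= marker <= 0xD8:  # standalone markers
            i += 2
            continue
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xE1 and payload[:6] == b"Exif\x00\x00":
            return payload[6:]  # strip "Exif\0\0", keep TIFF data
        if marker == 0xDA:  # start of scan: EXIF must precede image data
            return None
        i += 2 + length
    return None

def has_gps_tag(tiff):
    """Check IFD0 of a TIFF/EXIF block for the GPSInfo (0x8825) tag."""
    if tiff is None or len(tiff) < 8:
        return False
    endian = {b"II": "<", b"MM": ">"}.get(tiff[:2])  # byte-order mark
    if endian is None:
        return False
    ifd0 = struct.unpack(endian + "I", tiff[4:8])[0]
    count = struct.unpack(endian + "H", tiff[ifd0:ifd0 + 2])[0]
    for k in range(count):  # each IFD entry is 12 bytes: tag, type, count, value
        entry = ifd0 + 2 + 12 * k
        if struct.unpack(endian + "H", tiff[entry:entry + 2])[0] == GPS_TAG:
            return True
    return False
```

Tools like Pillow (`Image.getexif()`) or exiftool do this far more robustly; the point of the sketch is only that GPS coordinates, when present, are plainly readable by anyone who receives the file. Stripping them helps, but as the Chicago test shows, it does not stop a model from inferring location from the pixels themselves.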
AI models have increased the level of exposure, and many people haven't realized it. While this capability has promising applications, it also poses serious risks. Privacy depends not only on what we share, but also on what others can infer from it.
Images | Matheus Bertelli | Chris Dickens (Unsplash)
Related | o4-mini Is Much More Than Just Another AI Model. It’s OpenAI’s Tesla Model 3
The post Some People Are Using OpenAI’s o3 and o4-mini to Find the Location of Photos. It’s a Privacy Nightmare appeared first on Xataka On.