
Google AI Overviews’ embarrassing new problem is even funnier than pizza glue

April 24, 2025
in Apps, News, Tech

Ever heard your grandma tell you that “two buses going in the wrong direction is better than one going the right way”? Of course not, because that’s not an actual old saying. Nobody says that, unless you happen to be Google’s sophisticated Gemini-powered AI Overviews feature, which sits atop Google Search results in some markets.

AI Overviews doesn’t just think this saying about the buses is real. It actually hallucinates an explanation for the fake idiom, trying to reason its way through it. This is another significant embarrassment for Google’s Search-related AI plans. The good news is that it’s not as dangerous as suggesting you put glue on pizza to keep the cheese from falling off.

Apparently, you can feed Google Search fake idioms, and the AI will take them at face value and attempt to explain them. Here’s what the fake bus saying above means, according to an AI Overview that Tom’s Guide got:

The saying “two buses going in the wrong direction is better than one going the right way” is a metaphorical way of expressing the value of having a supportive environment or a team that pushes you forward, even if their goals or values aren’t aligned with your own. It suggests that having the momentum and encouragement of a group, even if misguided, can be more beneficial than being alone and heading in the right direction.

“Never put a tiger in a Michelin star kitchen” and “Always pack extra batteries for your milkshake” are other hilarious examples of AI Overviews hallucinating explanations, from the same Tom’s Guide roundup.

Similarly, Engadget found other examples of fake idioms that AI Overviews will explain. Here are a few other good ones: 

  • never rub your basset hound’s laptop
  • you can’t marry pizza
  • you can’t open a peanut butter jar with two left feet

Then there’s social media, which never fails to come up with more fake sayings that AI Overviews has no problem explaining in plain terms:

  • you can’t lick a badger twice
  • a squid in a vase will speak no ill
  • never spread your wolverine on Sunday
  • you can’t have a Cheeto in charge twice
  • never slap a primer on a prime rib
  • beware what glitters in a golden shower

I was in tears laughing while reading Google’s AI explanations for some of these.

Google has significantly improved AI Overviews since the glue-on-pizza disaster. More recently, Google added more health topics to AI Overviews, which is a notable step. It means Google feels safe enough to let AI Overviews generate such sensitive information atop search results.

But I wouldn’t blame you for not trusting any health-related information AI Overviews gives you after seeing it hallucinate explanations for fake idioms.

Then again, every AI chatbot model under the sun hallucinates. No AI firm has fixed the problem, including Google. The best recent example comes from OpenAI’s own research, which reveals that the frontier o3 and o4-mini reasoning models, released a few weeks ago, hallucinate more than their predecessors despite being more capable overall.

The difference is that Google chooses to put these AI Overviews atop Google Search, so you’ll stumble into AI-generated blocks of information, accurate or not, every time you perform a Google Search, whether you want them or not.

The good news is that AI Overviews isn’t enabled everywhere. For example, I’m not seeing them in the EU. Then again, I long ago ditched Google Search for my main internet search needs, which further reduces the risk of running into AI Overviews hallucinations, bus-related or otherwise.

The post Google AI Overviews’ embarrassing new problem is even funnier than pizza glue appeared first on BGR.

Tags: Gemini, Google
Copyright © 2025.