Getting tricked by fake cheat codes, passed along by word of mouth or on early web forums, was once a common experience for gamers.
It’s been a while since games like Pokémon Yellow and Final Fantasy debuted, and AI now seems to have taken over the role once held by your misinformed sibling.
As noted by veteran gaming journalist Stephen Totilo in his Game File newsletter, Google’s faulty AI Overviews have been providing incorrect advice on “Trash Goblin,” an indie game from UK studio Spilt Milk. Players control a goblin searching for treasures in trash, which he cleans and sells.
Totilo explains that players can’t damage the trinkets during gameplay; they simply clean them with mouse clicks. But when the journalist asked Google whether damage could occur, its AI falsely claimed it could.
“Yes, trinkets in Trash Goblin can be damaged while chipping away at the surrounding cruft,” the AI erroneously stated.
Making matters worse, the AI Overview then offered an inaccurate tip on avoiding damage that doesn’t actually happen in the game.
“You need to be careful when removing the cruft, especially the parts directly attached to the trinket, as hitting them could cause the trinket to break,” the AI incorrectly advised.
This is just one instance of AI Overviews getting facts wrong.
From falsely claiming that 26-year-old indie artist MJ Lenderman has won 14 Grammys to suggesting parents smear feces on balloons for potty training, recommending glue on pizza, and inventing bizarre idioms, these error-filled AI summaries are often comical and nonsensical.
While such instances can be amusing, they also signal the potential for more serious mistakes, especially given Google’s recent decision to incorporate health advice into AI Overviews. It’s worrying to think of the same feature that once claimed “elephants can fit in the palm of one’s hand” dispensing medical information.
More on Google AI: Local Restaurant Exhausted as Google AI Keeps Telling Customers About Daily Specials That Don’t Exist.