
Why Google’s AI Overviews told people to put glue on pizza

  • This week Google’s recently rolled out AI Overviews in Search were offering up highly inaccurate pieces of information.
  • One AI Overview, which has since become infamous, even prompted people to put glue on pizza to improve the “tackiness” of cheese on sauce.
  • Google has since addressed the issues, claiming data voids are to blame for the errors.

Following its I/O conference earlier this month, Google rolled out a new feature in Search called AI Overviews. As the name suggests, it is designed to leverage AI to deliver better search results, and in a way it did just that.

Shortly after launch, however, AI Overviews was spotted online by people testing its capabilities and sharing some truly weird, confusing, and wholly inaccurate information. One of the AI-generated answers that saw plenty of traction online was a suggestion to add glue to pizza sauce in order to make toppings adhere better to the dough and improve “tackiness”.

The result was people sharing even more weird suggestions, recommendations, and results from AI Overviews, and now Google has addressed this.

To that end, the company shared in a blog post the reason for the anomalies, as well as the steps it has taken to tackle them. Here Google’s head of Search, Liz Reid, pointed to the concept of data voids as the cause of the weird search results. A data void is the term for a gap in the wealth of information that the AI can source and generate answers from.

“One area we identified was our ability to interpret nonsensical queries and satirical content. Let’s take a look at an example: ‘How many rocks should I eat?’ Prior to these screenshots going viral, practically no one asked Google that question. You can see that yourself on Google Trends,” shared Reid.

“There isn’t much web content that seriously contemplates that question, either. This is what is often called a ‘data void’ or ‘information gap,’ where there’s a limited amount of high quality content about a topic. However, in this case, there is satirical content on this topic … that also happened to be republished on a geological software provider’s website. So when someone put that question into Search, an AI Overview appeared that faithfully linked to one of the only websites that tackled the question,” she added.

As such, when people started asking peculiar questions in AI Overviews, the platform generated equally weird answers.

While Reid’s explanation makes sense, it does not account for the weird and potentially dangerous response in the viral pizza sauce answer mentioned above. On this front, it looks like there simply has not been enough testing.

“In addition to designing AI Overviews to optimize for accuracy, we tested the feature extensively before launch. This included robust red-teaming efforts, evaluations with samples of typical user queries and tests on a proportion of search traffic to see how it performed. But there’s nothing quite like having millions of people using the feature with many novel searches,” she pointed out.

Moving forward, and in an effort to avoid more glue-on-pizza suggestions, Reid outlined four steps that Google is taking.

This is what Google has done in recent days, according to Reid:

  • “We built better detection mechanisms for nonsensical queries that shouldn’t show an AI Overview, and limited the inclusion of satire and humor content.
  • We updated our systems to limit the use of user-generated content in responses that could offer misleading advice.
  • We added triggering restrictions for queries where AI Overviews were not proving to be as helpful.
  • For topics like news and health, we already have strong guardrails in place. For example, we aim to not show AI Overviews for hard news topics, where freshness and factuality are important. In the case of health, we launched additional triggering refinements to enhance our quality protections.”

Whether this is enough to improve AI Overviews in future remains to be seen, but it looks like even the kings of search struggle with teething problems when implementing AI.

[Image – Photo by Pablo Pacheco on Unsplash]
