The benefits of smoking for children and other glaring AI errors in Google Search

After a year of testing with a limited number of users, Google last week launched AI Overviews in the United States: AI-generated summaries that appear above traditional search results. Users are already describing these summaries as a "disaster," reporting bizarre AI responses that include advice to add glue to pizza and claims about the benefits of smoking for children.

A widely shared screenshot shows Google's AI-generated response to a query about cheese not sticking to pizza. The search engine responded by listing "some things you can try" to avoid the problem. "Mixing cheese into the sauce helps add moisture to the cheese and dry out the sauce," noted AI Overviews. "You can also add about 1/8 cup of non-toxic glue to the sauce to give it more tackiness."

Where does such a response come from? Some users investigated and traced its origin to an 11-year-old Reddit post with eight upvotes. Last February, Google reached an agreement with the popular internet forum to train its AI on the platform's data.

"Google's AI search results are a disaster. I hate that it is becoming a resource that can no longer be trusted," said Tom Warren, senior editor at The Verge, about the case. The pizza tip is just one of numerous examples of AI Overviews' strange behavior circulating on social media.

A search for which US presidents attended the University of Wisconsin-Madison yielded equally erroneous information from Google. The search engine stated that former president Andrew Johnson, who died in 1875, had earned 14 degrees from the school, graduating in 2012, and that John F. Kennedy, assassinated in 1963, graduated in 1993, among other errors.

"thank you google ai, very cool," wrote the user illumi on May 22, 2024, sharing a screenshot of the result.

Another example showed results for the search "health benefits of tobacco for children." Google's AI responded that "tobacco contains nicotine, which can cause some short-term benefits, such as greater alertness, euphoria and relaxation." It also stated that possible uses of tobacco include "whitening teeth," among others.

"The examples we have seen are generally uncommon queries and are not representative of most people's experiences," a Google spokesperson said.

"The vast majority of AI summaries provide high-quality information, with links to dig deeper on the web. Our systems aim to automatically prevent content that violates AI Overviews policies from appearing. If content that violates policies does appear, we will take appropriate action," the spokesperson added in a statement.

Some users have also found that the AI seems to get confused by searches asking Google to convert 1,000 kilometers into an equivalent quantity of a specific object. For example, a user's search for "1,000 km in oranges" received another absurd response from Google: "One solution to the problem of transporting oranges 1,000 kilometers is to feed a horse 2,000 oranges at a time, one for each kilometer traveled, and sell the remaining 1,000 oranges on the market."

In another example shared on social media, Google responded to the search "how to treat appendicitis pain at home" by suggesting boiled mint leaves and a high-fiber diet. "Correct me if I'm wrong, but there's no such thing as a home remedy for appendicitis, right? Google AI suggests mint leaves," the user pointed out.

Google includes a disclaimer in all of its AI Overviews answers stating that "Generative AI is experimental." The company says it has built safeguards into the system to keep harmful content out of the results, and that it does not trigger AI Overviews for searches on potentially explicit or dangerous topics. Google plans to make AI Overviews available to 1 billion users by the end of the year.

Google came under fire last February after its AI image generation tool, Gemini, produced historically inaccurate images, such as racially diverse Nazi soldiers and Black popes. Google disabled the feature and apologized, saying it would fix it, but the capability has not yet returned to the chatbot.