Why you can’t blindly trust AI: Google Gemini’s advice nearly gave one person botulism

June 21, 2024  16:32

The advice that neural networks give us is not always useful, or even safe. A case recently shared by a Reddit user confirms this once again. He wanted advice on how to improve his salad dressing, but instead received instructions for growing the bacterium Clostridium botulinum, whose toxin causes botulism, a severe and potentially fatal form of poisoning.

A user with the nickname Puzzleheaded_Spot401 said that he asked the neural network how to improve an olive-oil salad dressing by adding garlic without heating it. The neural network offered a plausible-looking set of instructions, and the user followed them.

After three or four days, he noticed tiny bubbles rising from the bottom of the container. At first he thought this was part of the process. By day seven, however, he Googled it and found that he had effectively cultured botulism bacteria in a jar. As it turns out, garlic stored in oil is a well-known breeding ground for Clostridium botulinum: the oil seals out oxygen, creating the anaerobic, low-acid environment the bacterium needs to grow.

Had the user eaten the oil prepared according to the AI’s recipe, he could have been poisoned by botulinum toxin, which can be fatal.

[Image: gemini-dangerous-recomendation.png — screenshot of Gemini’s recommendation]

Why did the neural network issue such a dangerous recommendation?

The neural network may simply have gotten confused: garlic is indeed widely used in cooking, and many other ingredients can be infused in oil with perfectly safe results.

Some users joked that the incident could be an early sign of a machine uprising.

Either way, this incident highlights the urgent need to verify any information received from an AI, and to keep critical thinking engaged when using neural networks and other advanced technologies.

