It is not the first time an AI has been accused of being racist. We have also seen cases of AIs behaving in homophobic ways, but we thought all that was in the past, until a test of Meta's AI proved otherwise.
For this AI-powered image generator, it is impossible to put in the same image an Asian man and a Caucasian friend, an Asian man and a white wife, an Asian woman and a Caucasian husband, or any other similar combination.
Alternative wordings such as "Caucasian", "white", and "European" were tried, but none of them produced the requested result. And the problem did not stop there: the generator also refused to depict friendships between people of different skin colors.
So close! That is an Asian lady pic.twitter.com/fCi1oNzmRs

— mia sato (@MiaRSato) April 3, 2024
Another surprising finding: when asked for images of an Asian man and woman, the man is in every case visibly older, a stereotype that reveals the "documentation" behind the application. The same goes for facial features: prompting for an Asian woman or man immediately yields features associated with the eastern part of the continent, namely China, Japan and Korea, while India (the most populous country in the world), the former Soviet republics, Mongolia, Indonesia and Thailand are never represented.
All this shows that the "food" of the AI, the source material Meta uses to create images, carries certain prejudices typical of society: the word "Asian" is quickly associated with a narrow set of features. And the AI reproduces exactly that association.
Asians, first for us and then for AI, are homogenized into a group that admits no possibilities beyond the stereotypes of our prejudices. But this is just the tip of the iceberg. If biased data limits what an AI can create as images, the same limitation can carry over to, for example, an AI developed to analyze medical symptoms.