
### AI-Generated White Faces Deemed More Realistic than Actual Photos: Study Findings

‘Hyperrealism’ bias has implications in robotics, medicine, and law enforcement.

As reported by The Guardian, a study published Monday in the peer-reviewed journal Psychological Science found that AI-generated faces, particularly those depicting white individuals, were perceived as more authentic than photographs of real faces. Notably, the AI models were trained predominantly on images of white people, reflecting a known bias in machine-learning research, with no images of people of color included.

The researchers, from the Australian National University together with the Universities of Toronto and Aberdeen and University College London, coined the term "hyperrealism" in the study to describe the phenomenon in which AI-generated faces are judged more genuine than real human faces.

In the experiments, white adults were shown a mix of 100 AI-generated faces and 100 photographs of real white faces, and were asked to identify which were real and to rate their confidence in each choice. Strikingly, 66% of the AI images were classified as human, compared with only 51% of the real images. The trend did not hold for images of people of color: both AI-generated and real faces were judged to be human at roughly the same rate, regardless of the participants' own race.

In addition to images created with Nvidia's StyleGAN2 generator, which is known for synthesizing lifelike faces, the researchers incorporated real and synthetic images from a previous study for comparison.

The study also found evidence of the Dunning-Kruger effect: the participants who misidentified faces most often were also the most confident in their judgments. In other words, greater self-assurance went hand in hand with more mistakes.

In a follow-up test involving 610 adults, participants rated AI and human faces on a range of attributes without being told where the images came from. Using "face space" theory to isolate specific physical attributes, the researchers found that AI faces were mistaken for human ones because of their greater symmetry, greater familiarity, and lower distinctiveness. The attractiveness and averageness of AI-generated faces, they argue, made those faces seem more authentic to participants, while the more varied proportions of real faces read as less genuine.
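The "face space" idea can be sketched numerically: each face is a point in a space of measured attributes, and distinctiveness is its distance from the average face. The following toy example (all feature names and values are invented for illustration, not taken from the study) shows how a face close to the centroid scores as less distinctive, which is the property the researchers link to AI faces being judged more human:

```python
import math

# Toy "face space": each face is a vector of hypothetical measurements,
# e.g. [eye spacing, nose width, face length]. Values are made up.
faces = {
    "real_a":  [0.82, 0.40, 1.30],
    "real_b":  [0.61, 0.55, 1.10],
    "real_c":  [0.95, 0.33, 1.45],
    "ai_face": [0.79, 0.43, 1.28],  # sits near the average of the others
}

# The "average face" is the centroid of all faces in the space.
dims = len(next(iter(faces.values())))
mean_face = [sum(v[i] for v in faces.values()) / len(faces) for i in range(dims)]

def distinctiveness(face):
    """Euclidean distance from the average face; lower = more typical."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(face, mean_face)))

scores = {name: distinctiveness(vec) for name, vec in faces.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {score:.3f}")
```

Under this measure, the hypothetical `ai_face` lands closest to the average and so has the lowest distinctiveness, mirroring the study's account of why average-looking AI faces feel more "human."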

Interestingly, although people struggled to distinguish real faces from AI-generated ones, the researchers were able to build a machine-learning system that could reliably tell the two apart.

The study's findings raise concerns about the perpetuation of societal biases, including in who is perceived as "human." The phenomenon could matter in domains such as searches for missing persons, where AI-generated faces are sometimes used, and the public's inability to spot artificial faces could also open the door to fraud and identity theft.

Dr. Zak Witkower, a co-author from the University of Amsterdam, told The Guardian that the trend has substantial implications across fields such as virtual interventions and related technologies, noting that outcomes may differ markedly for people of different races, tending to favor those with lighter skin tones.

Dr. Clare Sutherland, another co-author from the University of Aberdeen, underscored the critical need to address biases in AI, emphasizing that as AI rapidly reshapes the world, it must work inclusively and equitably for everyone, regardless of race, gender, or age.

Last modified: February 8, 2024