I think the 'white' part is given too much importance in this headline. The more interesting finding is that study participants did _worse than chance_ at distinguishing AI and human faces when shown an even mix. The authors discuss a theory that machine-generated faces are more 'average' and 'less memorable', while the real human face distribution has higher variance.
> Writing in the journal Psychological Science, the team describe how they carried out two experiments. In one, white adults were each shown half of a selection of 100 AI white faces and 100 human white faces. The team chose this approach to avoid potential biases in how own-race faces are recognised compared with other-race faces.
> The participants were asked to select whether each face was AI-generated or real, and how confident they were on a 100-point scale.
> The results from 124 participants reveal that 66% of AI images were rated as human compared with 51% of real images.