
Can you tell AI-generated people from real ones?

Research shows survey participants duped by AI-generated images nearly 40 per cent of the time

Date:
March 6, 2024
Source:
University of Waterloo
Summary:
If you recently had trouble figuring out whether an image of a person is real or generated through artificial intelligence (AI), you're not alone. A new study found that people had more difficulty than expected distinguishing real people from artificially generated ones.

If you recently had trouble figuring out whether an image of a person is real or generated through artificial intelligence (AI), you're not alone.

A new study from University of Waterloo researchers found that people had more difficulty than expected distinguishing real people from artificially generated ones.

In the Waterloo study, 260 participants were shown 20 unlabeled pictures: 10 of real people obtained from Google searches, and 10 generated by Stable Diffusion and DALL-E, two commonly used AI image-generation programs.

Participants were asked to label each image as real or AI-generated and to explain their reasoning. Only 61 per cent of participants could tell the difference between AI-generated people and real ones, far below the 85 per cent threshold that researchers expected.
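
As a back-of-the-envelope illustration, the short Python sketch below shows how an overall accuracy figure like the 61 per cent reported above could be computed from survey responses. The data here are invented for illustration, not the study's actual responses.

    # Hypothetical illustration: computing an overall labelling accuracy
    # for a survey like the one described above. These responses are
    # invented examples, not data from the Waterloo study.
    responses = [
        {"truth": "real", "label": "real"},   # correctly identified
        {"truth": "ai",   "label": "real"},   # AI image mistaken for real
        {"truth": "ai",   "label": "ai"},     # correctly identified
        {"truth": "real", "label": "ai"},     # real photo mistaken for AI
        {"truth": "real", "label": "real"},   # correctly identified
    ]

    correct = sum(r["truth"] == r["label"] for r in responses)
    print(f"Accuracy: {correct / len(responses):.0%}")  # prints "Accuracy: 60%"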

"People are not as adept at making the distinction as they think they are," said Andreea Pocol, a PhD candidate in Computer Science at the University of Waterloo and the study's lead author.

Participants paid attention to details such as fingers, teeth, and eyes as possible indicators when looking for AI-generated content -- but their assessments weren't always correct.

Pocol noted that the nature of the study allowed participants to scrutinize photos at length, whereas most internet users look at images in passing.

"People who are just doomscrolling or don't have time won't pick up on these cues," Pocol said.

Pocol added that the rapid pace of AI development makes it particularly difficult to understand the potential for malicious use of AI-generated images. Academic research and legislation often can't keep up: AI-generated images have become even more realistic since the study began in late 2022.

These AI-generated images are particularly threatening as a political and cultural tool, since any user could create fake images of public figures in embarrassing or compromising situations.

"Disinformation isn't new, but the tools of disinformation have been constantly shifting and evolving," Pocol said. "It may get to a point where people, no matter how trained they will be, will still struggle to differentiate real images from fakes. That's why we need to develop tools to identify and counter this. It's like a new AI arms race."

The study, "Seeing Is No Longer Believing: A Survey on the State of Deepfakes, AI-Generated Humans, and Other Nonveridical Media," appears in the journal Advances in Computer Graphics.


Story Source:

Materials provided by University of Waterloo. Note: Content may be edited for style and length.


Journal Reference:

  1. Andreea Pocol, Lesley Istead, Sherman Siu, Sabrina Mokhtari, Sara Kodeiri. Seeing is No Longer Believing: A Survey on the State of Deepfakes, AI-Generated Humans, and Other Nonveridical Media. Advances in Computer Graphics (CGI 2023), 2024. DOI: 10.1007/978-3-031-50072-5_34

