Purdue University doctoral student Christina Walker visited Elon University on Oct. 22 to discuss the impact of political deepfakes on social media users and how labeling could help stop the spread of misinformation during her event “Watering Down Watermarks: The Effectiveness of Labeling Deepfakes.”

“Media literacy and digital literacy is rapidly changing,” Walker told Elon News Network. “As new and better models come out, it becomes harder — even for experts — to tell these things apart and to try to get the full picture for yourself.”

Deepfakes are AI-generated media that superimpose one person’s face and voice onto another. Walker and her team capture information on who shares this content on social media and track its original source when available.

The main goal of the political deepfake database is to track media users’ understanding of a specific watermark for AI-generated content, which signals that the material posted is fake. The database’s team observes photos and videos, then codes them based on whether the content can be verified through news sources or other databases.

“Deepfakes don't always have an inherently malicious intent,” Walker said. “There are discussions to be had. It’s hard to make super firm policies on these things right now because they are rapidly changing. We are still figuring out the costs and benefits of these things.”

Along with the database, Walker and her team have conducted research through oTree, an open-source research platform. Participants, who self-report political affiliation across both parties, engage with randomly selected social media posts, both authentic and artificial. They then complete a survey prompting them to reflect on the activity and receive an AI literacy sheet that reveals the authenticity of each post they interacted with. The researchers also recorded what participants clicked on within each post.

Monika Jurevicius | Elon News Network
Purdue University doctoral student Christina Walker hosts "Watering Down Watermarks: The Effectiveness of Labeling Deepfakes" on Oct. 22 at Elon University. The event was held to discuss the impact of political deepfakes on social media users and how labeling them could help stop the spread of misinformation.

Walker’s research began in collaboration with other Purdue and Emory University researchers in June 2023. Most of the team’s observations of deepfakes are sourced from X, formerly Twitter. The team is waiting until after the elections to conduct the full experiment.

The Turnage Family Faculty Innovation & Creativity Fund provides grants for researchers to conduct work on political communication in a changing digital landscape. Elon political science professor Barris Kesgin said the fund is “coming back from hiatus” after the COVID-19 pandemic, and that with a new technological innovation like AI, it was important to bring Walker’s research to campus.

Kesgin also credited Dave Turnage, whose contributions led to the creation of Elon University’s fund for the study of political communication and media literacy, with anticipating the future of technology in communications. Turnage died in 2022 at 90 years old.

“Dave Turnage had in mind how the traditional forms of news media were transitioning into the digital space,” Kesgin said. “We’re in the early stages. The Turnage Fund allows us to have these conversations and hear about experts and their research program.”

Walker was student editor of the Pi Sigma Alpha Undergraduate Journal of Politics at Oakland University, the journal’s previous host institution and where she met Kesgin. Kesgin is a member of the journal’s executive council and its current editor; Elon University is now the journal’s host institution for the fifth and final year.

Elon University released a guide on AI usage this September to better educate students on how to work alongside AI. Kesgin said he is not an expert in the field of politics and AI, but that the increased use of AI brings many opportunities for future research.

“I am motivated by adapting the emergence of these new resources and how best we can make use of them. Not randomly, not in an uneducated manner, but knowing what is available to us and how we can utilize it,” Kesgin said. “I think the human voice — human side — is necessary.”