Danielle Lee Tomson, the research manager for election rumors at the University of Washington Center for an Informed Public, discusses the impact of AI on the 2024 campaign. She explains that AI is being used to create an ambiance online, shaping how people feel rather than what they believe to be factually true. With just days to go before the Nov. 5 election, Tomson talks about the role of AI, social media, and trends in the spread of rumors online.
The Center for an Informed Public (CIP) was founded to make specialized knowledge accessible to the public. Tomson’s team of election rumor researchers is just one part of the center’s work, which spans several verticals: a media literacy project built around a misinformation awareness day, linguistics research, and efforts to demystify psychology research. Tomson emphasizes the importance of communicating research to the public so people can better understand the information being shared with them.
Tomson’s work at CIP focuses on election rumors. She explains that rumors can be true, false, or somewhere in between, and describes rumoring as a natural process of trying to make sense of reality. Misinformation, she notes, often twists or repackages pieces of truth to build false narratives, which is why a holistic view of rumoring is key to combating the spread of false information.
Tomson also discusses the state of fact-checking and trust and safety on social media platforms. Many platforms have undergone layoffs, including on trust and safety teams, weakening their ability to monitor and curb the spread of misinformation. The discussion touches on the limits of platforms’ willingness to regulate political content and the tension between driving user engagement and promoting accurate information.
The unexpected impact of AI on the campaign is another point of discussion. Tomson explains that while deepfakes and AI-generated falsehoods remain a concern, much of AI’s actual use is shifting toward creating an ambiance and influencing feelings online. She points to AI-generated images that place political candidates in contexts that are obviously false but nonetheless foster a sense of kinship or imaginative play among users.
The conversation doesn’t end with the Nov. 5 election, as Tomson mentions the importance of monitoring conversations related to election processes and procedures well past election day. She anticipates that there will be continued rumoring and sense-making across the political spectrum as people try to understand what happened and what is true. Overall, the discussion sheds light on the evolving role of AI in shaping the political landscape and the challenges in combating misinformation online.