A panel of experts delved into the impact of artificial intelligence on the upcoming elections last month at a discussion sponsored by the Office of Diversity, Inclusion, and Intercultural Initiatives (DI3).
From left: Musa al-Gharbi, Paige Lord, Thomas Costello, Klaus Mueller and Karim Boughida at the “Democracy in the Digital Age: AI’s Influence on the 2024 Elections” event.
Panelists for “Democracy in the Digital Age: AI’s Influence on the 2024 Elections” included artificial intelligence ethicist Paige Lord; Thomas Costello, assistant professor of psychology at American University; Klaus Mueller, professor in Stony Brook University’s Department of Computer Science and interim chair of the Department of Technology and Society at Stony Brook; and Musa al-Gharbi, assistant professor in Stony Brook’s School of Communication and Journalism. The discussion was moderated by Karim Boughida, dean of University Libraries.
Al-Gharbi said the root of the discussion might be in the simple realization that others have different ways of processing information. “There’s a temptation nowadays to assume that people who disagree with us, who vote the way that we don’t think they should, who hold views that we don’t agree with, are being subject to misinformation, or they’re being manipulated,” he said. “I don’t think that that’s helpful, and I don’t think it’s an accurate understanding of the social world.”
Al-Gharbi’s research indicates that people are not easily manipulated or brainwashed by others. In fact, he said, many people who repeat dubious claims are aware that what they’re saying may be true in a broad sense but not literally true.
“There are a lot of beliefs like this where people are sending social signals,” he said. “So fake news doesn’t really change people’s behavior as much as people think. It mostly serves up content to people who want to affirm what they already believe.”
Costello spoke of the power of AI to change people’s minds. He pointed to a recent paper in which people were asked to write about a conspiracy theory they believe, offering facts and evidence that support it, and then converse with the large language model GPT-4, which tried to debunk the conspiracy using facts and evidence. “After about an eight-minute conversation, you get a 20-percent decrease in that person’s belief in the conspiracy on average,” said Costello. “And in about one in four cases, people disavowed their original conspiracy entirely. That suggests that these AI models have real, meaningful persuasive capabilities.”
Lord began her work in AI in 2016, shortly after completing her undergraduate studies, noting that much of what was then just theory has since become reality. “My focus now goes toward debunking some of the claims that are being made, making sure that when I talk about artificial intelligence with regard to politics that it’s not sensational, and maybe I can help people to understand why there are nefarious uses of artificial intelligence in the election and what some of the motives might be.”
Mueller explained his work using data visualization and interactive interfaces to help people understand data better. He said people form personal causal networks, where gaps can allow a misconception to take hold. His research focuses on how people form those causal networks and on finding ways to appeal to them individually, using large language models like ChatGPT.
“For example, we can ask these models to impersonate people who live on the North Carolina coast or Tampa to find out what they think the causal chains underlying climate change are,” he said. “And then, based on the gaps we find, we can try to define messages to inform them better and maybe change behaviors that are not in their own best interests.”
Unfortunately, this formula can also be used for more nefarious purposes. “It’s a scary moment,” said Mueller. “We need to make sure that this engine is not misused and only presents truth and facts. Like every new technology, there is always the risk of misuse.”
Al-Gharbi said ChatGPT moves people because the exchange is dialogic, building rapport rather than delivering a generic message. Costello furthered the point, noting that conspiracy theories invite arguments that are inherently analytic and fact-based, and that conversations with AI push people to think rationally. He also said that the fact that AI is not a person may actually increase its ability to sway opinion.
“It’s easier to have belief change when someone comes to the conclusion themselves rather than having it forced upon them,” Costello said. “And then there’s the fact that some of these arguments and disagreements become a defense of your dignity and honor. Taking that out of the equation with AI, I think that could be an important mechanism in what’s going on here. But we haven’t explored that.”
Lord said the challenge is not just the ease of leveraging AI, but what humans do with the message. “People can now just take their thoughts and broadcast them into the world with minimal skill,” she said. “Back in the day if somebody wanted to create a fake magazine cover, they’d have to be a skilled designer. Now all it takes is somebody with an internet connection and some time.”
So what does the future hold?
“The honest answer is ‘I don’t know,’” said Costello. “Things are going to get really bizarre in a fundamental way. But it’s almost impossible to make a more substantive prediction.”
Al-Gharbi referred to periods of awakening in which growing numbers of people become suspicious about whether knowledge economy professionals are telling the truth. “It’s going to be important for mainstream knowledge professionals to really take seriously the fact that a lot of people in America don’t perceive themselves to have a voice, and that leads them to alternative explanations.”
“It’s important for us to remember that there are moments throughout history where entire groups of people have not been able to trust the people around them,” said Lord. “I think we have a moment right now where we can kind of see that coming, and we can figure out how it will affect us and what we want to do.”