Campaigns prepare for the explosion of artificial intelligence in 2024


The conference, which was attended by a POLITICO journalist, was not a gathering of Luddites. Most attendees were clear about the need to train campaign staff on how to use the technology. But operatives also stressed the need to teach voters how to identify AI-fueled misinformation and disinformation.

“Information sources that essentially work to have true information about the world will go from being nice to have to being absolutely indispensable,” added Dennis. “I hope we are preparing for this at the platform level. I hope voters are educating themselves.”

Earlier this year, the board of directors of the American Association of Political Consultants unanimously condemned the use of deepfakes in political advertising. “The use of generative ‘deep fake’ [AI] content is a dramatically different and dangerous threat to democracy,” Becki Donatelli, a Republican digital consultant and AAPC chair, said in a statement at the time.

To the untrained eye, it’s not always easy to tell whether an image is fake. Take, for example, a campaign video Ron DeSantis posted to Twitter showing seemingly fabricated images of former President Donald Trump embracing Anthony Fauci. There is currently no federal requirement to include a disclaimer in campaign ads when AI is used to create imagery, although such bills have been introduced in Congress. Washington state recently enacted legislation requiring a disclosure when artificial intelligence is used in campaign ads.

Last month at CampaignTech East, a gathering of digital campaign experts, Federal Election Commission commissioners offered insights into regulating AI in campaigns. Democratic Commissioner Shana Broussard suggested that current FEC political ad disclosure rules could be adapted to cover artificial intelligence, while Republican Commissioner Trey Trainor said he favored “as little regulation as possible.”

Democratic operatives at the Arena meeting said they were skeptical that industry-wide regulation would arrive ahead of the 2024 election. Betsy Hoover, co-founder of Higher Ground Labs, a progressive political technology incubator, called it a “strategic move” for Democratic campaigns to take the lead in setting guidelines for when AI is used in voter outreach.

AI can have many practical uses, like automating some of the more tedious and time-consuming tasks required in campaigns. It can fill the gaps for smaller, down-ballot operations that don’t have the resources to, say, generate multiple versions of a graphic or analyze data, Hoover said.

American Bridge’s Dennis said that if campaigns are fabricating footage, it could open the door to lawsuits. But he’s less concerned if they “use AI to do the same kind of attack stuff they’ve always done, only faster and cheaper.”

But the potential for misinformation and disinformation underscores the need for humans on staff, Hoover said. Campaigns, she said, should think about how their programs are prepared to respond to a “less stable media environment.”

“People will seek out trusted messengers more than ever,” she said. “Just as in 2016 and 2020, when we started saying, ‘Okay, we need to invest in influencer messaging and relational organizing,’ those things become more important in this cycle.”