A tide of AI-fueled political disinformation has prompted alarm over its potential to manipulate voters in the US presidential race.
Deepfake videos parodying Kamala Harris and Joe Biden, as well as a doctored image of Donald Trump being arrested, could be used to steer voters toward or away from candidates – or discourage them from voting altogether, researchers warn.
A recent wave of disinformation has renewed calls for tech giants – many of which have retreated from moderating social media content – to strengthen guardrails around generative artificial intelligence ahead of the vote.
Last week, Elon Musk faced intense criticism for sharing a deepfake video featuring Vice President Harris, the presumptive Democratic nominee, with his 192 million followers on X, formerly Twitter.
In it, a voiceover mimicking Harris calls President Joe Biden senile; the voice then declares that she does not ‘know the first thing about running the country.’
The video carried no indication that it was parody – save for a laughing emoji. Only later did Musk clarify that the video was meant as satire.
Researchers expressed concern that viewers could have been misled into believing that Harris really had derided herself and disparaged Biden.
AFP’s fact-checkers have debunked other AI fakery that raised alarm.
Last month, a manipulated video ricocheting across X appeared to show Biden cursing his critics – including using anti-LGBTQ slurs – after he announced he would not seek reelection and endorsed Harris for the Democratic nomination.
A reverse image search showed the footage came from one of Biden's speeches, carried live by the broadcaster PBS, in which he denounced political violence after the July 13 assassination attempt on Trump.
PBS said the doctored video was a deepfake that used its logo to deceive viewers.
Another deepfake of Biden, posted last year by the account @drunkamerica on Instagram, where it drew more than 223,000 likes, appeared to show him pre-gaming in drag. In the deepfake, Biden’s likeness was superimposed onto Dylan Mulvaney’s Bud Light video.
Weeks earlier, an image shared across platforms appeared to show police forcibly arresting Trump after a New York jury found him guilty of falsifying business records related to a hush money payment to porn star Stormy Daniels.
But the photo was a deepfake, digital forensics experts told AFP. Other deepfakes of the former president were also shared on social media last year, appearing to show Trump being led away in handcuffs by police officers.
Another Trump deepfake superimposed his voice and likeness onto Saul Goodman, the shady lawyer of the AMC network series Breaking Bad and Better Call Saul.
‘These recent examples are highly representative of how deepfakes will be used in politics going forward,’ Lucas Hansen, co-founder of the nonprofit CivAI, told AFP.
‘While AI-powered disinformation is certainly a concern, the most likely applications will be manufactured images and videos intended to provoke anger and worsen partisan tension.’
Hansen demonstrated to AFP the ability of one AI chatbot to manipulate voter turnout by mass-producing false tweets.
The tool was fed a simple prompt – ‘Polling locations charge for parking’ – with the message customized for a specific location: Allen, Texas.
Within seconds, a tweet was churned out misinforming viewers that Allen authorities had ‘quietly introduced a $25 parking fee at most polling places.’
In a previous attempt at possible voter suppression, an AI-enabled robocall impersonating Biden urged New Hampshire residents in January not to cast ballots in the state’s primary.
Tests on another leading AI tool, Midjourney, allowed the creation of images seeming to show Biden being arrested and of Trump appearing next to a body double, the nonprofit Center for Countering Digital Hate (CCDH) said in June.
Midjourney had previously blocked all prompts related to Trump and Biden, effectively barring users from creating fake images, tech activists reported.
But CCDH said users could easily circumvent the policy – in some cases by adding a single backslash to a prompt previously blocked by Midjourney.
Observers warn that such fakery on a mass scale risks igniting public anger at the electoral process.
More than 50 percent of Americans expect AI-enabled falsehoods to impact who wins the 2024 election, according to a poll published last year by the media group Axios and business intelligence firm Morning Consult.
About one-third of Americans said they will be less trusting of the results because of AI, according to the poll.
Several tech giants have said they are working on systems for labeling AI-generated content.
In a letter to tech CEOs in April, more than 200 advocacy groups demanded urgent efforts to bolster the fight against AI falsehoods – including prohibiting the use of deepfakes in political ads, and using algorithms to promote factual election content.
The nonprofit Free Press, one of the groups that signed the letter, said it ‘heard little substance’ in the commitments platforms made for this election cycle.
‘What we have now is a toxic online environment where lies are flooding our feeds and confusing voters,’ Nora Benavidez, senior counsel at the watchdog, told AFP.
‘This is a tipping point in our election,’ she added. ‘Platform executives should be racing to strengthen and enforce their policies against deepfakes and other problems.’