Death has not erased the decades-long rivalry between two Indian leaders: both have now seemingly risen from the grave in digital form to rally their supporters ahead of national elections.
Political parties are using powerful artificial intelligence tools to create deepfakes that reproduce famous faces and voices in ways that often appear authentic.
Both the government and campaigners have warned that the proliferation of such tools poses a dangerous and growing threat to the integrity of India’s elections.
As a marathon six-week general election begins on April 19, so-called “ghost appearances” – the use of dead leaders in videos – have become a popular form of campaigning in the southern state of Tamil Nadu.
Actress-turned-politician J. Jayalalithaa died in 2016, but her voice was recreated in an AI-generated audio message that was deeply critical of the state’s current ruling party, once led by her arch-rival M. Karunanidhi.
Recycling “very charismatic” speakers offers a novel way to grab attention, said Senthil Nayagam, founder of Chennai-based Muonium, which created the AI video purportedly depicting Karunanidhi.
Resurrecting dead leaders is also a cost-effective campaign method compared with traditional rallies, which are time-consuming to organize and expensive to stage for voters accustomed to big spectacle.
“Bringing crowds together is a difficult thing,” Nayagam told AFP. “And how often can you do a laser or drone show?”
Very fine line
Prime Minister Narendra Modi’s Bharatiya Janata Party (BJP) was a keen early adopter of technology during the election campaign.
In 2014, the year he came to power, the party expanded Modi’s campaign reach with 3D projections that allowed the leader to appear virtually at rallies.
But technology that can clone a politician’s voice and produce videos so realistic that voters struggle to distinguish fact from fiction has inevitably raised concerns.
Ashwini Vaishnaw, the communications minister, said in November that deepfakes were “a serious threat to democracy and social institutions.”
AI content creator Divyendra Jadoun said he had received a “huge wave” of requests for content through his company, The Indian Deepfaker.
“There is a lot of risk in this upcoming election and I’m pretty sure a lot of people are using it for unethical activities,” the 30-year-old said.
Jadoun’s repertoire includes voice cloning, chatbots and mass distribution of finished products via WhatsApp messages, instantly sharing content with up to 400,000 people for 100,000 rupees ($1,200).
He insisted he declined offers he didn’t agree with, but said it was a “very fine line” to determine whether a request for his services was unethical or not.
“Sometimes even we get confused,” he added.
Jadoun said the rapidly advancing technology was little understood in a “large part of the country,” where many people took AI-generated content to be real.
“We just tend to fact-check videos that don’t align with our preconceived notions,” he warned.
Threat to democracy
Most AI-generated campaign material so far has been used to taunt rivals, particularly through song.
This week, a leader of the BJP’s youth wing released an AI-generated video of Arvind Kejriwal, a leading Modi opponent who was arrested in a bribery probe last month.
It shows him sitting behind bars, playing the guitar and singing a verse from a popular Bollywood song: “Forget me, because you have to live without me now.”
Elsewhere, digitally altered videos purport to show lawmaker Asaduddin Owaisi, one of India’s most prominent Muslim politicians, singing Hindu devotional songs.
A caption alongside the video on Facebook jokes that “anything is possible” if Modi’s party, known for its Hindu nationalist politics and accused of discrimination against India’s Muslim minority, wins again.
Joyojeet Pal, an expert on the role of technology in democracy at the University of Michigan, said that ridiculing a political opponent is a more effective campaign tool than “calling him a criminal or a crook.”
Mocking opponents in political cartoons is a centuries-old tactic, but Pal warned that AI-generated images could easily be misconstrued as real.
“It is a threat to what we can and cannot believe,” he said. “It is a threat to democracy as a whole.”