A California lawmaker is proposing to restrict the sharing of manipulated videos depicting politicians amid mounting concerns that increasingly convincing "deep fakes" could give rise to misinformation in the approaching 2020 election.
But as policy makers grapple with an emerging technology, proposals to regulate videos have spurred debate about free speech and the government's role in regulating political discourse.
Assemblyman Marc Berman, a Democrat from Palo Alto, Calif., has proposed a law barring anyone from distributing audio or video of a candidate they know has been altered to mislead voters, unless the material includes a disclaimer that it was manipulated.
The proposed law would apply only during the 60 days before an election. A candidate depicted in a "deep fake" could take a person spreading the offending material to court.
Assembly Bill 730, he told a Senate committee on Tuesday, is not meant to stop anyone from saying anything they want.
But, he added: "Somebody doesn't have the right to put their words in my mouth."
Berman pointed to recent high-profile examples of manipulated videos giving rise to disinformation.
A video of U.S. House Speaker Nancy Pelosi, a Democrat from California, was slowed down to depict her as slurring her words and spread widely on social media in May along with posts suggesting she appeared drunk or sluggish. The footage was taken from a real event but reporters covering her appearance said she appeared coherent throughout.
In June, lawmakers and experts at a Congressional hearing described the emergence of technologies that use facial mapping and artificial intelligence to create fake videos as a national security threat. The chairman of the committee said such technology can allow "malicious actors to foment chaos, division or crisis."
In the wake of a 2016 presidential election that saw concerted, social media-savvy efforts to spread misinformation about candidates, Berman warned lawmakers that they must act quickly to stop emerging technologies that could be particularly effective tools for dirty campaigning.
"Shame on us if we wait until 2021 to deal with this," Berman told members of the Senate Elections and Constitutional Amendments Committee.
But Berman's proposal has met with opposition from civil liberties groups, newspapers and broadcasters that argue the bill would prove difficult to enforce while undermining the freedom of speech.
In a letter to the committee, the California News Publishers Association said the bill is too broad, applying to material that is not false and does not cause any harm.
"AB 730 only requires that the manipulated work be distributed with the intent that it cause harm to the candidate's reputation or deceive voters, it does not require that any actual harm occur," the organization said. "Likewise, with respect to the issue of harming a candidate's reputation, the bill doesn't require that the distributed work actually convey a false message, so long as the work has been somehow manipulated to appear authentic."
Disclosure rules are a half measure, said Danielle Keats Citron, a professor at Boston University who has studied the issue of deep fakes.
"Disclosure may be ineffective in part because people tend to credit video and audio as true as a visceral matter especially if it is salacious or accords with their views," she said.
Ultimately, she added, a disclaimer may not be seen by the very people it is meant to inform.
Going further and writing laws to ban these sorts of videos altogether would likely run into constitutional challenges, experts add.
Legislative aides noted the proposal also raises questions about the role not just of government but of the news media and social media companies in policing such content.
Social media companies have come under scrutiny for their handling of "deep fakes." Pelosi criticized Facebook, for example, for not removing the altered video of her that made the rounds in May, raising questions about the ability or willingness of the social media companies to address the issue.
The Senate committee advanced Berman's bill Tuesday by a vote of 3-1.