A New Hampshire jury acquitted political consultant Steven Kramer of all charges related to sending AI-generated robocalls that mimicked President Biden’s voice to thousands of Democratic voters before the 2024 primary. The case represents one of the first major legal tests of how courts will handle AI-powered election interference, with implications for how similar cases might be prosecuted as artificial intelligence becomes more sophisticated and accessible.
What happened: Kramer, a 56-year-old New Orleans political consultant, admitted to orchestrating the robocalls sent two days before New Hampshire’s January 23, 2024, presidential primary.
- The AI-generated voice mimicked Biden’s speech patterns and used his catchphrase “What a bunch of malarkey”
- The message told voters: “It’s important that you save your vote for the November election. Your votes make a difference in November, not this Tuesday”
- Prosecutors alleged the message falsely implied that voting in the primary would prevent voters from casting ballots in November
- Kramer paid a New Orleans magician $150 to create the recording
His defense strategy: Kramer testified he wanted to send a “wake-up call” about AI dangers rather than suppress votes.
- “This is going to be my one good deed this year,” he recalled in his testimony
- He argued the primary was a “meaningless straw poll” unsanctioned by the Democratic National Committee (DNC), making voter suppression laws inapplicable
- His defense claimed he didn’t impersonate a candidate because the message didn’t include Biden’s name and Biden wasn’t a declared primary candidate
- Kramer said he was getting frequent calls from people using AI in campaigns and worried about the lack of regulations
The verdict: Jurors acquitted Kramer of all 22 charges, which could have resulted in decades in prison.
- He faced 11 felony voter suppression charges, each punishable by up to seven years
- The 11 candidate impersonation charges each carried a maximum sentence of one year
Ongoing consequences: Despite the acquittal, Kramer still faces significant federal penalties.
- The Federal Communications Commission (FCC) imposed a $6 million fine, though Kramer told the Associated Press he won’t pay it
- Lingo Telecom, the company that transmitted the calls, agreed in August to pay $1 million in a settlement with the FCC
Regulatory landscape: The case highlights the evolving challenge of regulating AI in political campaigns.
- New Hampshire Attorney General John M. Formella said the state remains committed to “enforcing election laws” and addressing “challenges posed by emerging technologies”
- Many states have enacted legislation regulating AI deepfakes in political campaigns
- However, House Republicans recently added a clause to their tax bill that would ban states and localities from regulating artificial intelligence for a decade
- The FCC was developing AI-related rules when Donald Trump won the presidency, but the agency has since signaled it may loosen those regulations