AI‑Generated Robocalls Mimicking Biden Spark Legal Fallout in New Hampshire

Ahead of New Hampshire’s 2024 primary, a political consultant sent out robocalls that used AI-generated deep-fake audio to mimic Joe Biden’s voice, urging Democrats not to vote. The calls were designed to sound authentic, impersonating the then-president.

Although a jury later acquitted the consultant, Steve Kramer, on criminal charges of voter suppression and candidate impersonation, a civil lawsuit brought by voters succeeded. A federal court ordered him to pay damages to three affected voters and permanently barred him from engaging in such practices anywhere in the country.

Kramer, however, has refused to pay the awarded amount, roughly US$22,500, arguing that the civil ruling came only after his criminal acquittal. He has also defied a separate $6 million fine issued earlier by the Federal Communications Commission over his use of caller-ID spoofing and deep-fake voice technology.

The case has become a landmark in the debate over AI misuse in political campaigns, illustrating how generative‑AI technology can be weaponized to mislead voters. Legal advocates say the ruling underscores the need for stronger safeguards and clear regulation to prevent AI‑driven election interference.

About the author

TOOLHUNT

Effortlessly find the right tools for the job.
