
Deep Fake Dangers: You Could Get Manipulated by Artificial Tech During the 2024 Election

Updated: May 8, 2024 at 11:16 am EST

WASHINGTON, D.C. – Artificial intelligence might be used to spread disinformation this election season. One survey finds three-quarters of Americans are worried about the use of AI-generated deep-fakes to manipulate public opinion. 

From forged photos and doctored videos to lifelike robocalls, AI technology carries the power to deceive, manipulate, and disrupt. No longer the stuff of science fiction, these deceptions are a present reality that lawmakers worry poses a serious threat to our democratic republic, and we’re already seeing it play out.

Just days before New Hampshire’s January primary, nearly 20,000 voters received a call using an AI-generated voice of President Biden, attempting to discourage them from showing up at the polls.

“Voting this Tuesday only enables Republicans in their quest to elect Donald Trump again,” the AI-generated president said. 

Officials detected the deep-fake and alerted the public, but experts warn these deceptions will only become more sophisticated and more widely available.

“Imagine a world where that was a one-to-one attack. Where instead of it being pre-recorded, it was actually live, and instead of being from one to many, it was one-to-one, where it’s coming from your husband, your wife, your boss, saying, ‘Hey, Ben, we need you in the office at 6 a.m., I know it’s a voting…’”

The remainder of this article is available in its entirety at CBN
