AI ‘digital twins’ are warping political reality, leaving deepfake victims with few options for legal action

Artificial intelligence (AI) is producing hyperrealistic “digital twins” of politicians and celebrities, including in pornographic material, leaving victims of deepfake technology scrambling to determine their options for legal recourse.

Former CIA agent and cybersecurity expert Dr. Eric Cole told Fox News Digital that poor online privacy practices and people’s willingness to post their information publicly on social media leave them vulnerable to AI abuse.

“The cat is out of the bag,” he said.

“They have photos of us, know our children, know our families. They know where we live. Now, with AI, they are able to take all that data about who we are, how we look and how we act, and basically create digital twins,” Cole continued.

Images generated by AI, known as “deepfakes,” often involve editing videos or photos of people to make them look like someone else, or using their voices to make statements they never actually said. (Elyse Samuels/The Washington Post/Lane Turner/Boston Globe/Stefani Reynolds/AFP via Getty Images)

He said these digital twins are so convincing that it is difficult to tell the difference between the artificial version and the real person on whom the deepfake is based.

Last month, an audio post attributed to Donald Trump Jr. was discussed extensively on social media, appearing to be a clip from an episode of his podcast, “Triggered with Donald Trump Jr.”

Digital forensics experts later confirmed that the audio of Trump Jr.’s voice was created using AI, noting that the technology has become increasingly sophisticated.

FactPostNews, an official account of the Democratic Party, posted the audio as if it were authentic; the account later deleted the recording. Another account, Republicans against Trump, also shared the clip.

Over the past few years, there have been many examples of AI deepfakes used to mislead viewers of political content. A 2022 video appeared to show Ukrainian President Volodymyr Zelensky surrendering to Russia, but the fake was poorly produced and spread only briefly online.

Manipulated videos of President Donald Trump and former President Joe Biden later appeared in the run-up to the 2024 U.S. presidential election. Built from existing footage, these clips often altered Trump’s or Biden’s words or behavior.

A woman in Washington, D.C., views a manipulated video on January 24, 2019, that changed what President Donald Trump and former President Barack Obama said, illustrating how deepfake technology has evolved. (Rob Lever/AFP via Getty Images)

AI-generated images, known as “deepfakes,” often involve editing videos or photos of people to make them look like someone else. Deepfakes hit the public’s radar in 2017, when Reddit users posted realistic-looking celebrity pornography to the platform, opening the floodgates for users who employed AI to make such images more convincing and shared them more widely over the following years.

Cole told Fox News Digital that people are their “own biggest enemy” when it comes to AI deepfakes, and that limiting their online exposure may be the best way to avoid becoming a victim.

But in politics and the media, where visibility is key, public figures become prime targets for malicious AI use. Threat actors interested in copying President Trump, for example, have plenty of material to feed into a digital twin, siphoning data on the American leader across a variety of settings.

“The more video I can get of how he walks, the way he speaks, the way he acts, the more I can feed into the AI model, and the deepfake I develop will be just as realistic as President Trump. That’s where things get very, very scary,” Cole said.

In addition to individuals taking responsibility for the personal data they expose online, Cole said legislation could be another way to reduce the improper use of AI.

Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., introduced the TAKE IT DOWN Act, which would make it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including “digital forgeries” created with artificial intelligence. The bill passed the Senate unanimously in early 2025, and Cruz said in early March that he believed the House would pass it, after which it would become law.

First lady Melania Trump traveled to Capitol Hill on Monday for a roundtable to rally support for the Take It Down Act. (Fox News)

The proposed legislation carries penalties of up to three years in prison for sharing nonconsensual intimate images, whether real or AI-generated, involving minors, and up to two years for images involving adults. Threats to publish such images would carry up to two and a half years in prison when they involve minors and a year and a half when they involve adults.

The bill would also require social media companies such as Snapchat, TikTok, Instagram and similar platforms to remove such content within 48 hours of being notified by a victim.

Melania Trump made her first appearance since returning to the White House when she spoke on Capitol Hill earlier this month, joining lawmakers and victims of revenge porn and AI-generated deepfakes.

“Today I am here with you with a common goal: to protect our young people from online harm,” Melania Trump said on March 3.

Andy Locascio, co-founder and architect of Eternos.Life (credited with building the first digital twin), said that while the Take It Down Act is a worthwhile effort, it is totally unrealistic to assume it will be effective. He noted that most of the AI deepfake industry operates from locations beyond the reach of U.S. law, and the legislation would likely affect only a small number of the offending sites.

National security expert Paul Scharre views a manipulated video by BuzzFeed and filmmaker Jordan Peele (right on screen), which used readily available software and apps to change what former President Barack Obama (left on screen) appears to say, demonstrating how deepfake technology can deceive viewers, in Washington, D.C., on January 25, 2019. (Rob Lever/AFP via Getty Images)

He also noted that text-to-speech cloning technology can now create “perfect fakes.” While most major providers have significant controls in place to prevent the creation of fakes, Locascio told Fox News Digital that some commercial providers can be easily fooled.

Additionally, Locascio said anyone with a reasonably powerful graphics processing unit (GPU) can build a voice model capable of supporting a “clone.” Some readily available services require less than 60 seconds of audio to produce one, and basic software can then refine the result to make it more convincing.

“The paradigm for realism in audio and video has changed. Everyone must now assume that what they see and hear is fake until it is proven to be real,” Locascio said.

While there is little criminal guidance on AI deepfakes, attorney Danny Karon said alleged victims can still file civil lawsuits and be awarded money damages.

In his forthcoming book, Your Lovable Lawyer’s Guide to Legal Wellness: Fighting Back Against a World That’s Out to Cheat You, Karon notes that AI deepfakes fall under traditional defamation law, specifically libel, which involves spreading a false statement of fact through writing, pictures, audio or video.

This illustration photo, taken on January 30, 2023, shows a phone screen displaying a statement from Meta’s head of security policy alongside a deepfake video of Ukrainian President Volodymyr Zelensky. (Olivier Douliery/AFP via Getty Images)

To prove defamation, a plaintiff must present evidence and arguments addressing specific elements that meet the legal definition under state law. Many states apply similar standards for proving defamation.

For example, under Virginia law, as applied in Depp v. Heard, actor Johnny Depp’s team had to establish the following elements of defamation:

  • The defendant made or published a statement
  • The statement was about the plaintiff
  • The statement carried a defamatory meaning about the plaintiff
  • The defamatory meaning was designed and intended by the defendant
  • Given the circumstances surrounding the publication, it would convey that defamatory meaning to those who saw it

“You can’t reach a conclusion until you know what the law and defamation are. For example, Amber Heard didn’t, which is why she didn’t think she was doing anything wrong. It turned out she was wrong, she got stuck, and she paid all that money. That’s the analysis people need to go through to stay out of trouble over what they put online,” Karon said.

Karon told Fox News Digital that AI deepfake claims can also proceed under invasion of privacy laws, intrusion claims, civil stalking statutes and the right of publicity.

This hyperrealistic image of Bruce Willis is actually a deepfake created by the Russian company Deepcake using artificial neural networks. (Deepcake via Reuters)

“Tom Hanks, for example, recently had his voice cloned to promote a dental plan. That’s an example of a company using someone’s name, image and likeness, in this case to sell a product or gain publicity from another person. You can’t do that,” Karon said.

Unfortunately, problems can arise if a plaintiff cannot determine who created the deepfake, or if the perpetrator is located in another country. In that case, those hoping to bring a defamation case may need to hire a cyber expert to trace the source of the content.

If the individual or entity is overseas, venue becomes an issue. Even if the person is identified, the plaintiff must determine the answers to questions such as:

  • Can the individual be served with legal process?
  • Will the foreign country help facilitate the case?
  • Will the defendant show up for trial?
  • Can the plaintiff reasonably expect to recover any money?

If the answer to any of these questions is no, it may not be worth investing the time and money required to pursue the claim.

“Our rights are only as good as our ability to enforce them, just like patents. People say, ‘I have a patent, so I’m protected.’ No, a patent is only as good as your ability to enforce it,” Karon said.

Brooke Singman and Emma Colton of Fox News contributed to the report.
