UK’s internet watchdog toughens approach to deepfake porn | TechCrunch

The UK’s internet safety regulator, Ofcom, has published another piece of draft guidance as it continues to implement the Online Safety Act (OSA): a new set of recommendations aimed at supporting in-scope companies in meeting their legal obligations to protect women and girls from online threats such as harassment and bullying, misogyny, and intimate image abuse.

The government has said that protecting women and girls is a priority for its implementation of the OSA. Certain forms of misogynistic abuse, such as sharing intimate images without consent or using AI tools to create deepfake pornography targeting individuals, are explicitly listed as enforcement priorities.

The online safety regulation was approved by the UK parliament in September 2023. Critics doubt it will succeed in reforming the platform giants, despite penalties for non-compliance that can reach up to 10% of global annual turnover.

Child safety campaigners have also expressed frustration at how long it is taking to enforce the law, and doubt whether it will have the desired effect.

In a BBC interview in January, even the technology minister who inherited the legislation from the previous administration called it “very uneven” and “unsatisfactory.” But the government is sticking with the approach. Part of the dissatisfaction around the OSA can be traced to the long lead time for implementing the regime, which requires parliament to approve Ofcom’s compliance guidance.

Enforcement of the core requirements on illegal content and child protection is expected to begin soon, however. Other aspects of OSA compliance will take longer to implement, and Ofcom acknowledges that this latest safety guidance will not become fully enforceable until 2027 or later.

Nearing the enforcement starting line

“The first duties of the Online Safety Act come into force next month,” Jessica Smith, who leads Ofcom’s development of the women’s safety guidance, told TechCrunch in an interview. “So we will be enforcing certain core duties of the Online Safety Act before this guidance [itself becomes enforceable].”

The new draft guidance on keeping women and girls safe online is intended to complement Ofcom’s broader guidance, such as its guidance on illegal content and its advice on protecting minors from viewing adult content online.

In December, the regulator published final guidance on how platforms and services should reduce risks related to illegal content, in which protecting children is a clear priority.

It has also previously produced child safety rules, recommending that online services dial up age checks and content filtering to ensure children are not exposed to inappropriate content such as pornography. And, as part of implementing the online safety regime, it has developed age-assurance technology recommendations for adult content websites, intended to push pornographic sites to take effective steps to prevent minors from accessing age-inappropriate content.

The latest guidance was developed with input from victims, survivors, women’s advocacy groups, and safety experts. It covers four main areas where the regulator says women are disproportionately harmed online: online misogyny; pile-ons and online harassment; online domestic abuse; and intimate image abuse.

Safety by design

Ofcom’s overarching recommendation urges a “safety by design” approach from in-scope services and platforms. Smith told us the regulator wants to encourage tech firms to “take a step back” and “think about their user experience in the round.” While she acknowledged that some services have taken measures to help reduce online risks in this area, she sees a lack of holistic thinking when it comes to prioritizing the safety of women and girls.

“What we’d really like to see is a step change in how the design process works,” she told us. The goal is to ensure safety considerations are baked into product design.

She highlighted the rise of AI image-generation services, which she noted has driven a “huge” increase in deepfake intimate image abuse, as a risk where technologists could have taken proactive measures to prevent their tools from being weaponized against women and girls, but did not.

“We think there are things that services could do at the design phase that would help address the risks of some of these harms,” she suggested.

Examples of “good” industry practice in the guidance include online services taking actions such as:

  • Removing geolocation by default (to reduce privacy and stalking risks);
  • Conducting “abusability” testing to identify how a service could be weaponized or abused;
  • Taking steps to boost account security;
  • Designing in user prompts that make posters think twice before posting abusive content;
  • And offering accessible reporting tools that let users report problems.

As with all of Ofcom’s OSA guidance, not every measure will be relevant to every type or size of service, since the law applies to online services of all sizes across many sectors, from social media to online dating, gaming, forums, and messaging apps, to name a few. So a big part of the work for in-scope companies will be understanding what compliance means in the context of their own products.

Asked whether Ofcom had identified any services that currently meet the standards of the guidance, Smith suggested they had not. “There’s a lot more that can be done across the industry,” she said.

She also acknowledged that the challenge may be growing, given that some major industry players have taken retrograde steps on trust and safety. For example, since taking over Twitter and rebranding the social network as X, Elon Musk has gutted its trust and safety headcount in favor of pursuing a self-styled free speech maximalist approach.

Meta, which owns Facebook and Instagram, has appeared to take some imitative steps in recent months, saying it is ending third-party fact-checking contracts in favor of deploying an X-style “Community Notes” system of crowdsourced labels on disputed content.

Transparency

Smith suggested that Ofcom’s response to this kind of high-level shift (where operators’ actions may risk dialing up, rather than suppressing, online harms) will center on using its transparency and information-gathering powers under the OSA to illustrate the impact and drive user awareness.

So, in short, the strategy here appears to be “name and shame,” at least in the first instance.

“Once we’ve finalized the guidance, we’re going to produce a [market] report… about who is using the guidance, who is following which steps, and what outcomes they’re achieving for their women and girl users, and really shine a light on the protections on different platforms, so users can make informed choices about where they spend their time online,” she told us.

Smith suggested that platforms hoping to avoid being publicly shamed for poor performance on women’s safety will be able to turn to Ofcom’s “practical steps” guidance to understand how to improve the situation for their users and address the reputational risk.

“Platforms operating in the UK will have to comply with UK law,” she added. “So that means complying with the illegal harms duties and the child protection duties under the Online Safety Act.”

“I think that’s where our transparency powers come in too: if the industry changes direction and harms are increasing, then we will be able to share that information with UK users, the media, and parliamentarians.”

Technology for tackling deepfake porn

One online harm where Ofcom is explicitly strengthening its recommendations ahead of active OSA enforcement is intimate image abuse: the latest draft guidance suggests using hash matching to detect and remove such abusive imagery, whereas earlier recommendations did not go that far.

“We’ve included some additional steps in this guidance beyond what we’ve already set out in the codes,” Smith noted, confirming Ofcom’s plan to update its earlier codes to incorporate this change “in the near future.”

She added: “So that’s a way of saying to platforms, you can get ahead of that mandatory requirement by following the steps set out in this guidance.”

Ofcom’s recommendation to use hash-matching technology to counter intimate image abuse responds to what Smith said is a significantly increased risk, particularly relating to AI-generated deepfake image abuse.

“Reports of deepfake intimate image abuse in 2023 exceeded all previous years combined,” she noted, adding that Ofcom has also gathered more evidence on the effectiveness of hash matching in addressing this harm.
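The guidance does not prescribe a particular implementation, but the core idea of hash matching can be sketched simply: a service keeps a database of digital fingerprints (hashes) of known abusive images, then computes the hash of each new upload and checks it against that list. A minimal illustration in Python, using exact SHA-256 hashing for clarity; real deployments typically use perceptual hashes (such as PhotoDNA or PDQ) so that re-encoded or lightly altered copies still match:

```python
import hashlib


def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact byte sequence.

    Note: an exact cryptographic hash only matches byte-identical
    copies; perceptual hashing is needed to catch altered re-uploads.
    """
    return hashlib.sha256(image_bytes).hexdigest()


class HashMatcher:
    """Blocklist of fingerprints of known abusive images."""

    def __init__(self) -> None:
        self._blocklist: set[str] = set()

    def register_abusive(self, image_bytes: bytes) -> None:
        """Add a reported image's fingerprint to the blocklist."""
        self._blocklist.add(image_fingerprint(image_bytes))

    def is_known_abusive(self, image_bytes: bytes) -> bool:
        """Check an upload against the blocklist before publishing."""
        return image_fingerprint(image_bytes) in self._blocklist


# Usage: once a victim reports an image, exact re-uploads are caught.
matcher = HashMatcher()
reported = b"\x89PNG...reported-image-bytes"  # placeholder image data
matcher.register_abusive(reported)
print(matcher.is_known_abusive(reported))        # True: re-upload blocked
print(matcher.is_known_abusive(b"other bytes"))  # False: unrelated upload
```

Only hashes are stored, not the images themselves, which is part of why industry bodies favor this approach: known abusive material can be blocked at upload time without services retaining copies of it.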

The full draft guidance is now out for consultation (Ofcom is inviting feedback until May 23, 2025), after which the final guidance will be produced by the end of the year.

Eighteen months after that, Ofcom will produce its first report reviewing industry practice in this area.

“So we’re looking at 2027 [for the first report on protecting women and girls online], but there’s nothing to stop platforms from acting now,” she added.

Pushing back on criticism of the OSA, she said it was right for the regulator to consult on compliance measures. But with the first measures coming into force next month, Ofcom expects the conversation around the issue to change too.

That, she predicts, will really start to change the conversation with platforms.
