Like that, social media companies might as well close shop.

gsbslut

Stupidman
Loyal
UK's new online safety law adds to crackdown on Big Tech companies

Britain Tech Regulation
British lawmakers have approved an ambitious but controversial new internet safety law with wide-ranging powers to crack down on digital and social media companies like TikTok, Google, and Facebook and Instagram parent Meta.
The government says the online safety bill passed this week will make Britain the safest place in the world to be online. But digital rights groups say it threatens online privacy and freedom of speech.
The new law is the UK’s contribution to efforts in Europe and elsewhere to clamp down on the freewheeling tech industry dominated by U.S. companies. The European Union has its Digital Services Act, which took effect last month with similar provisions aimed at cleaning up social media for users in the 27-nation bloc.
Here's a closer look at Britain's law:
The sprawling piece of legislation has been in the works since 2021.
The new law requires social media platforms to take down illegal content, including child sexual abuse, hate speech and terrorism, revenge porn and posts promoting self-harm. They also will have to stop such content from appearing in the first place and give users more controls, including blocking anonymous trolls.
The government says the law takes a “zero tolerance” approach to protecting kids by making platforms legally responsible for their online safety. Platforms will be required to stop children from accessing content that, while not illegal, could be harmful or not age-appropriate, including porn, bullying or, for example, glorifying eating disorders or providing instructions for suicide.
Social media platforms will be legally required to verify that users are old enough, typically 13, and porn websites will have to make sure users are 18.
The bill criminalizes some online activity, such as cyberflashing, which is sending someone unwanted explicit images.
The law applies to any internet company, no matter where it's based, as long as a UK user can access its services. Companies that don't fall in line face fines of up to 18 million pounds ($22 million) or 10% of annual global sales, whichever is greater.
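As a rough illustration of the "whichever is greater" fine rule, here is a minimal Python sketch. The turnover figures and the function name are hypothetical and only show the arithmetic; the actual penalty in any case would be set by Ofcom.

# Sketch of the maximum-fine rule described above: up to 18 million pounds
# or 10% of annual global sales, whichever is greater.
FIXED_CAP_GBP = 18_000_000   # 18 million pounds
REVENUE_SHARE = 0.10         # 10% of annual global sales

def max_fine_gbp(annual_global_sales_gbp: float) -> float:
    """Upper bound of the fine for a given annual global turnover, in pounds."""
    return max(FIXED_CAP_GBP, REVENUE_SHARE * annual_global_sales_gbp)

print(max_fine_gbp(50_000_000))     # 18000000 -> the fixed cap dominates
print(max_fine_gbp(1_000_000_000))  # 100000000.0 -> the 10% share dominates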
Senior managers at tech companies also face criminal prosecution and prison time if they fail to answer information requests from UK regulators. They'll also be held criminally liable if their company fails to comply with regulators' notices about child sex abuse and exploitation.
Ofcom, the UK communications regulator, will enforce the law. It will focus first on illegal content as the government takes a “phased approach” to bring it into force.
Beyond that, it’s unclear how the law will be enforced because details haven’t been provided.
Digital rights groups say the law's provisions threaten to undermine online freedoms.
The UK-based Open Rights Group and the US-based Electronic Frontier Foundation said that if tech companies have to ensure content is not harmful to children, they could be forced to choose between sanitizing their platforms and requiring users to verify their ages by uploading official ID or submitting to privacy-intrusive face scans that estimate how old they are.
The law also sets up a clash between the British government and tech companies over encryption technology. It gives regulators the power to require encrypted messaging services to install “accredited technology” to scan encrypted messages for terrorist or child sex abuse content.
Experts say that would create a backdoor into private communications, ultimately making everyone less safe.
Meta said last month that it plans to start adding end-to-end encryption to all Messenger chats by default by the end of the year. But the UK government called on the company not to do so without measures to protect children from sex abuse and exploitation.

© Copyright 2023 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
 