UK’s internet watchdog finalizes first set of rules for Online Safety law

2024/12/17

TechCrunch Industry News

Topics
Ofcom: The UK has published the first set of final guidelines under its Online Safety Act, marking the start of the online harms law's first compliance deadline. The Act aims to protect users from a range of illegal content, including terrorism, hate speech, child sexual abuse and exploitation, and fraud and financial offenses. More than 100,000 tech firms could be in scope, and non-compliance risks substantial fines. The Act is not one-size-fits-all: larger services and platforms generally face more obligations. All service providers must have content moderation systems, user reporting mechanisms, clear terms of service, and the ability to remove accounts of proscribed organizations. Larger platforms may need to make greater operational changes to avoid breaching their legal duties, such as changing how their algorithms work, removing illegal content, and protecting the privacy of children's accounts. The law also introduces criminal liability for senior managers, meaning tech company CEOs could be held personally accountable for certain types of violations. Ofcom will continue to review risks and may further evolve its requirements on service providers to keep pace with tech developments such as the rise of generative AI. Ofcom also plans crisis response protocols, measures such as blocking the accounts of users who share child sexual abuse material, and guidance on using AI to tackle illegal harms.

Melanie Dawes: Tech companies need to change how their algorithms work, remove illegal content, and protect the privacy of children's accounts. Ofcom will set out age-check requirements in January and finalize wider child protection rules in April, covering pornography, suicide and self-harm material, and violent content.

Deep Dive

Key Insights

What is the first compliance deadline for the UK's Online Safety Act?

The first compliance deadline is March 16, 2025, by which providers must assess the risk of illegal harms on their services; from March 17, 2025, they must apply the safety measures set out in Ofcom's codes or use other effective measures.

What are the potential penalties for non-compliance with the Online Safety Act?

Non-compliance with the Online Safety Act risks fines of up to 10% of global annual turnover or up to £18 million, whichever is greater.

How many tech firms could be affected by the Online Safety Act?

More than 100,000 tech firms could be in scope of the Online Safety Act, ranging from large tech companies to very small service providers.

What specific illegal content types does the Online Safety Act address?

The Online Safety Act addresses over 130 priority offenses, including terrorism, hate speech, child sexual abuse and exploitation, and fraud and financial offenses.

What changes are tech companies required to make under the Online Safety Act?

Tech companies must change how algorithms work to prevent illegal content like terrorism, hate speech, and intimate image abuse from appearing in feeds. They must also ensure swift takedown of illegal content and set children's accounts to private to prevent contact from strangers.

What additional measures is Ofcom planning for child safety?

Ofcom plans to introduce requirements for age checks in January and finalize rules on wider protections for children in April, addressing pornography, suicide and self-harm material, and violent content.

What role does criminal liability play in enforcing the Online Safety Act?

The Online Safety Act introduces criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of noncompliance.

How does the Online Safety Act apply to smaller, lower-risk services?

Smaller, lower-risk services are not exempt from obligations and must comply with requirements such as having a content moderation system, mechanisms for user complaints, clear terms of service, and removal of accounts run by proscribed organizations.

What future measures is Ofcom considering in response to tech developments?

Ofcom is considering further measures to address risks from tech developments like generative AI, including crisis response protocols, blocking accounts sharing child sexual abuse material, and using AI to tackle illegal harms.

Shownotes Transcript

This is TechCrunch. This episode is brought to you by Factor.

Notice how the days are shorter but your to-do lists aren't? Here's a trick: Factor. From breakfast to dinner and anything in between, Factor has easy, nutritious options to keep you fueled and feeling your best. My box from Factor is on its way and it could not get here soon enough. I'm so excited because you get to choose from six menu preferences to help you manage calories, maximize protein intake, avoid meat, or simply eat a well-balanced diet.

Whether you like routine or you enjoy mixing things up, Factor has you covered with 35 different delicious meals every week and over 60 additional convenience options you can add to your box like keto cookies, pressed juices, and smoothies.

Don't let shorter days slow you down. Stay energized with America's number one ready-to-eat meal delivery service. Head to factormeals.com slash 50TCIndustry and use code 50TCIndustry to get 50% off your first box plus free shipping. That's code 50TCIndustry at factormeals.com slash 50TCIndustry to get 50% off your first box plus free shipping while your subscription is active.

On Monday, the UK's internet regulator Ofcom published the first set of final guidelines for online service providers subject to the Online Safety Act. This starts the clock ticking on the sprawling online harms law's first compliance deadline, which the regulator expects to kick in in three months' time.

Ofcom has been under pressure to move faster in implementing the online safety regime following riots in the summer that were widely perceived to have been fueled by social media activity, although it is just following the process lawmakers set out, which has required it to consult on and have Parliament approve final compliance measures.

This decision on the illegal harms codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm, Ofcom wrote in a press release. Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of March 16, 2025. Subject to the codes completing the parliamentary process, from March 17, 2025, providers will need to take the safety measures set out in the codes or use other effective measures to protect users from illegal content and activity. We are ready to take enforcement action if providers do not act promptly to address the risks on their services, it added.

According to Ofcom, more than 100,000 tech firms could be in scope of the law's duties to protect users from a range of illegal content types, in relation to the over 130 priority offenses the Act sets out, which cover areas including terrorism, hate speech, child sexual abuse and exploitation, and fraud and financial offenses.

Failure to comply risks fines of up to 10% of global annual turnover or up to £18 million, whichever is greater. In-scope firms range from tech giants to very small service providers, with various sectors impacted, including social media, dating, gaming, search, and pornography. The duties in the Act apply to providers of services with links to the UK, regardless of where in the world they are based.

The number of online services subject to regulation could total more than 100,000 and range from some of the largest tech companies in the world to very small services, wrote Ofcom. The codes and guidance follow a consultation, with Ofcom looking at research and taking in stakeholder responses to help shape the rules since the legislation passed Parliament last fall and became law in October 2023. The regulator has outlined measures for user-to-user and search services to reduce risks associated with illegal content. Guidance on risk assessments, record-keeping, and reviews is summarized in an official document linked in the text version of this article.

Ofcom has also published a summary covering each chapter in today's policy statement. The approach the UK law takes is the opposite of one-size-fits-all, with generally more obligations placed on larger services and platforms where multiple risks may arise compared to smaller services with fewer risks.

However, smaller, lower-risk services do not get a carve-out from obligations either. Indeed, many requirements apply to all services, such as having a content moderation system that allows for swift takedown of illegal content, having mechanisms for users to submit content complaints, having clear and accessible terms of service, and removing accounts run by proscribed organizations, among many others.

Many of these blanket measures are features that mainstream services, at least, are likely to already offer. But it's fair to say that every tech firm offering user-to-user or search services in the UK is going to need to undertake an assessment of how the law applies to its business, at a minimum, if not also make operational revisions to address specific areas of regulatory risk. For larger platforms with engagement-centric business models, where the ability to monetize user-generated content is linked to keeping a tight leash on people's attention, greater operational changes may be required to avoid falling foul of the law's duties to protect users from myriad harms.

A key lever to drive change is the law introducing criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of noncompliance.

Speaking to BBC Radio 4's Today program on Monday morning, Ofcom CEO Melanie Dawes suggested that 2025 will finally see significant changes in how major tech platforms operate. What we're announcing today is a big moment, actually, for online safety because in three months' time, the tech companies are going to need to start taking proper action, she said. What are they going to need to change? They've got to change the way the algorithms work. They've got to test them so that illegal content like terror and hate, intimate image abuse, lots more, actually, doesn't appear in our feeds. And then if things slip through the net, they're going to have to take it down. And for children, we want their accounts to be set to private so they can't be contacted by strangers, she added.

That said, Ofcom's policy statement is just the start of its work to action the legal requirements, with the regulator still working on further measures and duties in relation to other aspects of the law, including what Dawes couched as wider protections for children that she said would be introduced in the new year. So more substantive child safety-related changes to platforms, which parents have been clamoring for, may not filter through until later in the year.

In January, we're going to come forward with our requirements on age checks so that we know where children are, said Dawes. And then in April, we will finalize the rules on our wider protections for children. And that's going to be about pornography, suicide and self-harm material, violent content, so that it's just not being fed to kids in the way that has become so normal but is really harmful today.

Ofcom's summary document also notes that further measures may be required to keep pace with tech developments such as the rise of generative AI, indicating that it will continue to review risks and may further evolve requirements on service providers. The regulator is also planning crisis response protocols for emergency events such as last summer's riots, proposals for blocking the accounts of those who have shared CSAM, child sexual abuse material, and guidance for using AI to tackle illegal harms.