
Your compliance obligations under the UK’s Online Safety Bill; or, welcome to hell

source link: https://webdevlaw.uk/2022/07/11/your-compliance-obligations-under-the-uks-online-safety-bill/

Last month I wrote a post about the UK’s “world-leading” vision for age-gating the open web. It got a bit of attention. That post, sadly, encompassed only one aspect of your compliance obligations under the Online Safety Bill. In this post, I’m going to tell you about the rest.

I apologise in advance.

Preamble from the room where it happens

[Photo: “This is fine”]

This post was tricky enough to write, but on top of that, last week I found myself in the Palace of Westminster, representing my employer at a roundtable of small tech businesses and startups who stand to be collateral damage in the UK’s determination to regulate the internet around Facebook, via this Bill. That meeting gave me a chance to work through some of these concerns and to sharpen others.

Or it would have, if the nine MPs across three parties who were scheduled to attend actually showed up.

Only one did. 

And he’s a good ‘un who “gets it”, but who also has skin in the game, having been on the receiving end of the most horrific online abuse; he wants us to help him and people like him.

That is the kind of person we should be working with, and doing business with, to help him and people like him without throwing everyone and everything else under a bus. Me, I will sit down and work with representatives like him any time.

Where was everyone else? Well, you could assume that they were off playing real-life Game of Thrones; Boris Johnson’s time was up, you see, so they were all elsewhere drawing up factions and sharpening knives.

Or you could be cynical and say that a meeting about small businesses like yours held no interest for them; there’s no headline, no PR, no wild west hero sheriff fantasy, no “big tech crusader” mantle to claim for anyone sitting in a room with the likes of me and, by representative extension, the likes of you.

So the roundtable was a real-life version of the “this is fine” meme, as everyone who was in attendance sipped their coffee and nibbled their patisserie and chatted amiably while the room was on fire.

Still, being on one side of the building while the government was literally collapsing on the other side of the building, and then heading up the street to have a giggle fit at the party in front of No 10 (link NSFW), made it a meeting I’ll never forget. I mean, what are the chances that I walk into Parliament for the first time in two and a half years and the government implodes?

Whoopsie.

Now let’s amble.

I want to help you understand what the UK’s draft Online Safety Bill will mean for you and your work on the open web. This post is my attempt to explain the compliance obligations, as they’ve been drafted, and how they will hit you.

By “you”, I mean anyone working in a company or a project which will fall in scope of the Bill, whether that’s your paid work, your software community, or your personal hobby.

And by “you”, I mean a company which is Not Big Tech, or as I’ll call it for the purposes of this post, NBT. Big Tech has its compliance departments, legal teams, and policy specialists. You don’t.

“You” also means an NBT whose product or service does not deal in legal and consensual adult content or conduct. That means porn, whether as your main business model or as some of the content on your service; because if it does, you’ve got specialist compliance experts for that too.

As with the previous post, this is tremendously long: 4100 words. There’s really no way to make it shorter. You’re going to need something slightly stronger than coffee. Please drink as responsibly as a Conservative MP with poor impulse control at the Commons bar in the middle of the day.

It goes without saying that this is not legal compliance advice. It also goes without saying that this is a draft Bill, not quite “the law” yet, so anything I write here is subject to change.

Additionally, I present this post for information only: so don’t shoot the messenger. Not all of these ideas are necessary, proportionate, or even feasible. I’m presenting them to give you the facts you need to work with.

How to read this post

This post reflects the second draft version of the Online Safety Bill, plus amendments, as published on 28 June 2022. Unfortunately, it is only available as a PDF (the link opens the full document, which runs to 230 pages).

When you see a numbering system following an excerpt from the Bill, that’s my shorthand for the Part, Chapter, Clause, and Paragraph it came from within the draft Bill text. So for example, 3/2/8/(5) refers to Part 3, Chapter 2, Clause 8, Paragraph 5. I wouldn’t have to do this if legislators published legislation in open text formats rather than PDFs. Pfft.

As you read this post, you should also hold everything in it in the light of these two questions.

The first is:

How will my having to do this address online harms and make the web a better place?

And the second is:

Why is this government throwing me, and my team, and my project, under a bus, with these compliance requirements and obligations, in order to get back at “big tech”?  

If you can’t come up with an answer to either of these questions, that itself is the answer.

I have divided this guide into six areas:

  1. Is your work in scope?
  2. Compliance assessment obligations
  3. Administrative obligations
  4. General monitoring obligations
  5. Compliance costs
  6. What can you do?

Is your work in scope of the UK’s Online Safety Bill?

Is it possible for your site, service, or app, which allows content to be shared and/or people to communicate with each other, to be accessed by any adult or any child within the UK?

Then you’re in scope.

NB “accessed” doesn’t necessarily mean that a user can set up an active account on your service. If a British adult can merely download your app from an app store, the app is in scope. If a British child can merely type your URL into a browser, the site is in scope.

This Bill has been aggressively promoted as being about “big tech”, “social media”, and “tech giants”, but it is not, and it never was. The ongoing line that it’s here to “rein in the tech giants” is, and always has been, bullshit. In fact, I’m going to start being really hardline about this by saying that anyone – be it politicians, media, or civil society – who still discusses it as being about “big tech” and “social media” and “tech giants” is spreading disinformation.

And you folks need to stop that.

So the bottom line is that it’s easier and safer to assume that you and your work are in scope, than to assume that you are not.

And don’t forget that this regulatory regime is expected to be extraterritorial. If you are not in the UK but your site, service, or app can be accessed by anyone in the UK, you’re fair game.
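If it helps to see just how blunt that test is, here it is sketched as a few lines of throwaway Python. Every name below is mine, invented for illustration – the Bill’s actual definitions run to pages of clauses and schedules – but the logic is the whole of the test as I’ve just described it:

```python
# A deliberately blunt sketch of the draft Bill's scope test.
# All names here are mine, for illustration only.
from dataclasses import dataclass

@dataclass
class Service:
    allows_shared_content: bool    # users can post or share content
    allows_user_interaction: bool  # users can communicate with each other
    reachable_from_uk: bool        # a UK adult or child can merely reach it:
                                   # an app store listing or a typed URL counts

def in_scope(s: Service) -> bool:
    # Note what is absent: no minimum size, no UK establishment test,
    # no requirement that anyone ever creates an account.
    return (
        (s.allows_shared_content or s.allows_user_interaction)
        and s.reachable_from_uk
    )

# Your personal hobby forum, reachable by anyone with a browser:
print(in_scope(Service(True, True, True)))  # True. Welcome aboard.
```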

Compliance assessment obligations

First let’s talk about the paperwork. As you’ll recall from the previous post, the government’s digital regulation strategy is to scrap EU bureaucracy and paperwork and red tape in order to, erm, make way for UK bureaucracy and paperwork and red tape. Bumf, but British bumf!

First and foremost of these are the risk assessment obligations you will be required to devise and produce to keep Ofcom, as the online harms regulator, happy. This is a result of the Bill’s attempt to transfer the offline “health and safety” model to the open web, in the belief that online harms are a series of trip hazards which can be nailed down with proper risk assessments.

I’ve had a good few years to reflect on this model of internet regulation, and it finally dawned on me that the “trip hazard model” was a Freudian slip that gives the game away. The intention is not to prevent companies from laying “trip hazards”. The intention is to use the legislation to lay trip hazards in front of companies, in the form of these impossible risk assessment compliance processes, which exist solely to create the paperwork needed to set you up to fail.

It’s these assessments that are the trip hazards, on purpose.

Sometimes risk assessments can be good. However much everyone rags on GDPR, the privacy impact assessment is a priceless opportunity to ask open-ended questions, and follow where they lead, to prevent problems from ever happening down the road and to mitigate the ones already in play. Those questions, of course, are based in international standards and human rights principles. What’s on the table here, by contrast, doesn’t seem to be open-ended questions or the upholding of international principles. This is table-pounding which demands: prove yourself, or else.

So what are these assessments? Your NBT will have to go through these, at the very minimum. Their specific shape, size, and requirements are yet to be determined – that’s for Ofcom to decide down the road, as with so much else of this legislation – but what is about to be hammered into law is the following:

  1. An illegal content risk assessment
  2. Your duties towards that illegal content
  3. Your duties towards the reporting of content
  4. Your duties towards establishing complaints procedures
  5. Your duties about freedom of expression and privacy (this is all the Tory culture war BS landing on your doorstep)
  6. Your duties about record-keeping and review of your content moderation and takedown policies
  7. Your child access assessment (the age gating)
  8. A child risk assessment
  9. Your duties about children’s safety
  10. Transparency reports

Let’s zero in on just two of those: the illegal content risk assessment and the child safety risk assessment.

Risk assessment hell

For these two risk assessments, you will be required to identify the following, in writing:

  1. Your user base
  2. The level of risk to users of encountering each kind of illegal content (generally terrorism and CSAM)
  3. The level of risk to users of encountering other illegal content
  4. The level of risk of harm to individuals presented by illegal content of different kinds
  5. The number of children accessing the service by age group
  6. The level of risk to children of encountering each kind of primary priority content (this means porn and content relating to self-harm, suicide, and eating disorders)
  7. The level of risk to children of encountering each kind of priority content (meaning online abuse, cyberbullying, harassment, harmful health content, or content depicting or encouraging violence)
  8. Each kind of primary priority or priority content which is harmful to children or adults, with each separately assessed
  9. The presence of any non-designated content which is nevertheless harmful to children
  10. The level of risk of harm presented by different descriptions of content, for children, that is harmful, by age group
  11. The level of risk of functionalities allowing users to search for other users, including children
  12. The level of risk of functionalities allowing users to contact other users, including children
  13. The level of risk to adults of encountering other content that is harmful
  14. The level of risk of functionalities of the service facilitating the presence or dissemination of illegal content, identifying and assessing those functionalities that present a higher risk
  15. The different ways in which the service is used, and the impact that has on the level of risk of harm that might be suffered by individuals
  16. The nature and severity of the harm that might be suffered by individuals by the above, including age groups
  17. The design and operation of the service, including the business model, governance, and other systems and processes that might reduce or increase the risks of harms

I probably missed something in there, but you’re probably curled up in a ball crying as it is.

Oh, see that last one? Every time you make a change to your business model, governance, or systems or processes which might theoretically increase the risk of any subjective online harm, you’ll be expected to check first with Ofcom, as the regulator.

Having fun yet? There’s more!

Administrative obligations

In addition to the risk assessments, you will have administrative compliance obligations to Ofcom as your content regulator. These include:

  1. The requirement to register with them as a service provider in scope of the law
  2. The requirement to pay them an annual fee
  3. The requirement to respond to information notices (e.g. requests from them for clarification on anything we’ve discussed in this post)
  4. The requirement to designate a person to prepare a report to Ofcom when they come knocking
  5. The requirement to assist the person in preparing that report
  6. The requirement to cooperate with them in an investigation
  7. The requirement to attend an interview with them when they physically call you in
  8. The requirement to make a public statement in certain specific and extreme circumstances
  9. The requirement to report any CSAM/CSEA on your service to the National Crime Agency

No, we’re not done yet.

General monitoring obligation

You probably don’t want to get into this today, but if you weren’t aware, I’ll just cut and paste this from a previous post:

The draft Bill’s explanatory notes provide a reminder (see page 12) that

Article 15 of the eCD also contained a prohibition on the imposition of requirements on service providers to generally monitor content they transmit or store, or to actively seek facts or circumstances indicating illegal activity. […] there is no longer a legal obligation on the United Kingdom to legislate in line with the provisions of the eCD following the end of the transition period on 31 December 2020.

Having swept 25 years of intermediary liability into the bin, the draft Bill text then goes on to establish a general monitoring obligation for both illegal content and legal content, which of course, means anything conveniently stuffed into the rubric of “children’s safety”.

Massive increase in Ofcom powers to require proactive monitoring by use of technology (S.116). Previously prohibited except for terrorism and CSEA.

— Graham Smith (@cyberleagle) March 17, 2022

In other words, the UK is ditching the prohibition on a general monitoring obligation – its own equivalent of the US Section 230 – specifically because it came from the EU and must be thrown out with the bathwater. In its place, the UK becomes the first western nation to impose a general monitoring obligation covering illegal content as well as legal and subjective content. Yay taking back control!

And that’s gonna cost you.

Compliance costs

In addition to the costs of your time and labour in complying with all of the above, how much is this regime going to run you, out of pocket?

Well, your NBT in scope of the Online Safety Bill is facing at least four kinds of ongoing compliance costs.

Regulator fees

The first cost you’ll incur is paying an annual fee to the regulator (Ofcom) for the privilege of being regulated by them. (6/71)

This is obviously intended to be similar to the annual data protection fee you pay to the ICO. However, the latter system is about working with the regulator to uphold your users’ fundamental right to privacy, while the former system is about working with the regulator to decimate your users’ fundamental right to privacy.

So we’re off to a flying start, then.

The amount of the fee has not yet been determined; it’s down to Ofcom to ensure that it is “justifiable and proportionate” (6/75/2/b).

However, my guiding rule is that you shouldn’t put too much trust in the words “justifiable and proportionate”, not when twelve years of Conservative attitudes about the internet have worked from the default position that you people are all filthy vermin who are complicit in child abuse and terrorism, and the burden of proof is on you to demonstrate otherwise.

(Remember VATMOSS? This makes that look like primary school.)

Clause 73 also discusses the establishment of a threshold figure. The explanatory notes state that OFCOM will be funded via fees from providers of regulated services whose qualifying worldwide revenue is equal to or greater than the specified threshold, as determined by clause 73. This clearly means that not every business in scope will be required to pay an Ofcom fee. But does that mean that businesses which don’t have to pay the fee will still face the compliance requirements regardless? And can somebody find out?

Compliance help

The second cost you’ll incur, as an NBT, is the legal advice which you will need to take to understand your compliance obligations: in other words, the expert advice you’ll need to bring in to help you know what it is you’re expected to do, lest you get shouted at that you are failing to meet your duty of care to Britain’s children.

Paragraph 124 of the Impact Assessment projects that cost as follows, and I have included the screen cap so that you understand that I am not making this up:

[Screenshot: Impact Assessment, paragraph 124]

one regulatory professional at an hourly wage of £20.62 is expected to read the regulations within each business […] the explanatory notes are approximately 52,000 words and would therefore take just over four hours based on a reading speed of 200 words per minute.
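If you want to check their sums, here is that projection as a few lines of Python. The figures are the Impact Assessment’s; the variable names and the rounding are mine:

```python
# The Impact Assessment's "familiarisation cost" arithmetic (paragraph 124).

WORDS = 52_000           # approximate length of the explanatory notes
WORDS_PER_MINUTE = 200   # the assumed reading speed
HOURLY_WAGE = 20.62      # GBP, "one regulatory professional"

hours = WORDS / WORDS_PER_MINUTE / 60      # 260 minutes, just over 4 hours
cost = hours * HOURLY_WAGE

print(f"Reading time: {hours:.1f} hours")  # Reading time: 4.3 hours
print(f"Projected cost: £{cost:.2f}")      # Projected cost: £89.35
```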

In other words, your NBT is expected to locate the mythical regulatory compliance professional who

  • Understands your NBT from the top of its business model to the bottom of its codebase;
  • Understands how the Online Safety Bill will work in practice;
  • Understands how the Online Safety Bill will link up with every aspect of your NBT; and
  • Is willing to do all of this, from start to finish, in half a day, for around ninety quid.

Wow. Okay.

Let me put it this way, folks: as a regulatory professional, if you want me to even look at you, you need to move the decimal point in £20.62 one digit to the right. If you want me to actually listen to you, you need to move it another digit to the right.

Per day.

Mandatory age-gating software

The third compliance cost for your NBT will be implementing a third-party age verification or age assurance system, to identify the ages of everyone who accesses your service, even if they are not, and never become, actual users of your service; because if you don’t, YOU HATE THE CHILDREN.

We discussed this in the previous post, the 3000-word, top-of-Hacker-News one.

If you need to pause now and go find someone to give you a hug, I won’t hold that against you.

General monitoring software

The fourth compliance cost will be the scanning and monitoring services to detect both illegal as well as legal and subjective content, as we discussed above.

Because unless you spend every waking minute of your life in God-mode, monitoring and reading every keystroke your users type in every interaction with your service and every pixel they exchange with other humans, and rush in to correct or edit or censor anything that might be legal but subjectively harmful, you’re going to need to install some sort of automated screening software.

You’re not going to do that because you’re a horrible nosey internet stasi. You’re going to do that because your compliance obligations say you have to do that, or else duty of care the children etc etc blah blah; or if you’re lucky enough to have a senior management position in your organisation, it’s your personal freedom and your bollocks on the line, as you could well face criminal charges for being uncooperative.


At this point I want to share an email I received from a follower in response to last month’s post:

It seems to me that the Bill (like most attempts to “regulate big tech”) will actually reinforce the position of the incumbents, by further raising the bar for any smaller entity that wants to compete. The cost of implementation is likely to show big economies of scale – so favour the already big – and any attempt to negotiate on what the obligations are will require lawyer-time on a scale that only the big can afford. Littler people will be welcome to use the big platforms, I’m sure. So a Bill that is being widely sold as “beating up Facebook” will actually strengthen them. Whether that is the intent, I’ve no idea, but I think a lot of people are going to be surprised and disappointed.

He’s absolutely right. There was a time, about a year ago, when I made this meme as a joke:

[Roll Safe meme]

What I’ve come to realise since then is that it’s not a joke. That’s the intention. Make it too prohibitive, risky, or impossible for public discourse to flow on smaller platforms and services; require the larger ones to become speech police and societal monitors; politicise the hell out of it and threaten tech workers with jail until they comply.

It’s not that they don’t realise they’re throwing you and your work under a bus.

It’s that they do.


So what can you do?

The Bill is in its report stage this week, just before Parliament heads off for summer recess, and any legislative programme – bad or good – takes a firm backseat to Tory Game of Thrones.

And I hate to break it to you, but this is starting to look like a lost cause.

As I learned myself last week, politicians aren’t interested in small businesses or projects like yours. If there’s no angle that will allow them to present themselves as crusading heroes “taking on the tech giants”, they’re not interested.

Those of you who have engaged with your representatives have tended to receive response letters full of stock messages about social media and online harms. Remember what I said at the beginning of this post, about how this Bill has been aggressively promoted as social media legislation? Most elected representatives think that’s what it is, because that’s what they have been told too. They aren’t aware of the nuances or the complexity or the scope or the collateral damage either.

And the people who do understand those complexities – meaning the money machine that has driven this Bill, the one that only communicates with the public through paywalled, adtech-riddled, PR agency-drafted op-eds in right-leaning broadsheets – have far more power and influence than all of you put together ever will.

So what can you do? As the past month has shown, you could choose to let the people responsible for this Bill divide and conquer among themselves, taking the Bill with them. This Conservative Bill, for example, has rent the Conservative party into bitter factions. The Telegraph, whose idea this Bill was in the first place and which gleefully crusaded for it for years, is not only backtracking on its own Bill but is pretending we all haven’t noticed. And even the civil society organisations who have supported this Bill are souring on it as they realise they’ve been used too. A little nudge might go a long way.

You could, also, stage your own campaign. Announce that you’re blocking UK users. Announce that you’re pulling out of the UK. Announce that you will not partner with the UK government in identifying all your users and surveilling their conversations, on the assumption that they are all deviant criminals. Send out a note to your users warning them that you may have to terminate their accounts if this Bill passes, and let them run with their anger. Take a stand. Defend your users’ rights. Be at the table, rather than on the menu.

But whatever you choose to do, you can’t just oppose what’s wrong. You have to come up with an alternative plan to put it right. Because your opposition to this Bill, and what it’s going to do to your work and to your users, will be taken as an endorsement of the status quo as well as a confession of guilt that you are complicit in those things. That’s the tactic, you see: any opposition to this Bill is either lobbying or “intransigence” or collusion with Big Tech.

And you are not guilty of any one of those things. So come prepared for battle, because just by being yourself, you have more ammunition than you think.

I have my own ideas – a creative exercise, if you will – about how this Bill could (and should) be scrapped and restarted from scratch, on a far better footing.

But you’ve read enough for today.

Header image by me, 6 July 2022: the room where it happens, snapped on my way to “The Room Where It Happens”. It was actually a very lovely summer’s day, but I’m committed to the black and white aesthetic.

