Or, simply put, the "censorship bill." Has anyone else seen this thing? Two hundred twenty-five pages of overly vague nonsense, shifting the burden of censorship to the online platforms and providers for no good reason.
Since most of you don't have the time to waste reading this thing (advantages to being at least semi-retired, y'know...) let me hit a few highlights. You can view the bill and follow along at https://docs.reclaimthenet.org/uk-online-safety-bill-final.pdf ...
Starting with Part 2 - "Key Definitions"
- Under "User-to-User Service" and "Search Service," "(1) In this Act, "user to user service" means an Internet service by means of which content that is gnerated directly on the service by a user of the service, or uploaded to or shared on the service by a user or the service, may be encountered by another user, or other users, of the service. (2) For the purposes of subsection (1); (a) it does not matter if content is actuallyshared with another user or users as long as a service has a functionality that allows such sharing; and (b) it does not matter what proporton of content on a service is content desribed in that subsection." [emphasis mine]
Interpretation: any service that is, or could function like, say, Facebook or X can be regulated under this law. It doesn't matter whether you have content sharing turned on or off; or if, out of everything you have on your server, only one small file is shared. That's enough.
Part 3, Chapter 1, Part 5, Subsection 7: "Chapter 7 is about the interpretation of this Part, and it includes definitions of the following key terms: 'illegal content,' 'terrorism content,' 'CSEA content' and 'priority illegal content' (see section 52); 'primary priority content that is harmful to children,' 'priority content that is harmful to children,' and 'content that is harmful to children' (see section 53); 'priority content that is harmful to adults' and 'content that is harmful to adults' (see section 54); ..."
For "illegal content" &c, the only one I have to look up is "CSEA content" (it's not defined in section 52, either,) but I can see censoring such content, and that's the sort of thing a provider should be jumping on, and a censor bot should be flagging for a human operator to verify and delete or not, depending. So I can see all of that.
Section 53 defines what each bracket of "harmful to children" is, but does not describe what "harmful to children" itself means. Wonder if it's described somewhere else? If not, that leaves an awful lot of open territory. Stay tuned... Subsection (3) - " 'Priority content that is harmful to children' means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to children." I wonder how much I'd have to dig to resolve this tautology, because defining something as itself is just stupid. Likewise subsection (4): " 'Content that is harmful to children' means (a) primary priority content that is harmful to children, (b) priority content that is harmful to children, or (c) content, not within paragraph (a) or (b), of a kind which presents a material risk of significant harm to an appreciable number of children in the United Kingdom." Good as far as it goes - but, again, what's "harm?"
Section 54 - "Content that is harmful to adults" - is similarly tautological. What's "harm?"
Section 150 - "Harmful Communications Offence (1) A person commits an offence if (a) the person sends a message (see section 153), (b) at the time of sending the message - (i) there was a real & substantial risk that it would cause harm to a likely audience, and (ii) the person intended to cause harm to a likely audiance, and (c) the person has no reasonable excuse for sending the message."
So we have to prove (or disprove!) state of mind at a point in the past, prove (or disprove) a question of intent, and prove (or disprove) justification for the content (this last is probably the easiest of the three). However, we're back to "causing harm" again - I've been thumbing through every instance where "harm" appears as a word or part of a word in this mess, and the title of this section is the 222nd occurrence of "harm." I am still looking for a proper definition, instead of the tautology originally offered...
Finally! Subsection (4) - " 'Harm' means psychological harm amounting to at least serious distress." Goody. I have problems with this (found at the 226th occurrence of "harm," by the way):
1) It's a horribly nebulous term. What would cause 99 men in 100 to go to bits wouldn't even faze me - perhaps 999 in 1,000; I'm pretty unflappable these days.
2) This is a horribly nebulous definition in general.
3) What is "serious distress?"
4) We're dealing with a population that goes all to bits if you use the wrong fucking pronouns to refer to them. What else would cause these thin-skinned wastes of space to go to pieces?
(Disclaimer - if you haven't figured out by now that I am not politically correct, or whatever term they're using these days, you haven't been paying attention. That's a "you" problem. Deal with it however you see fit, but I won't censor myself to protect fragile sensibilities, I'm going to keep saying - roughly - what I think. I do tone it down just a bit...)
Were I in charge of, say, X, here's how I'd respond to such a bill: "Gentlemen, I have received your law, and I have given it the attention it deserved. Having just swept the ashes into the bin, allow me to reply. Your desire for censorship is a 'you' problem; we serve the whole world. You want Great Britain censored, you do it. We will continue to jump on illegal content, and we are accelerating efforts to do so. However, what you consider 'offensive' doesn't even flick the needle on my offence meter, and I'm not going to expect my people to put themselves in your place to enforce a vague law rife with tautologies and nebulous critical definitions. Enforcing this will therefore be your problem. Anyone you send to arrest any of my personnel shall themselves be subject to arrest for kidnapping, or at least liable to grievous bodily injury during the attempt to execute such a warrant. And we will vigorously fund their defence. Love, kisses, and fuck you very much. Have a nice day."
Here's another fun one - Section 151, Subsection 1(c) - "A person commits an offence if - ... (c) at the time of sending it, the person intended the message, or the information in it, to cause non-trivial psychological or physical harm to a likely audience ..." How are we defining "non-trivial," and just how nebulous will this end up being? "Non-trivial" is an awfully slippery term to use in law, where definitions should be objectively quantifiable.
Section 154, Subsection (3) - "Proceedings for an offence committed under section 150, 151, or 152 outside the United Kingdom may be taken, and the offence may for incidental purposes be treated as having been committed, at any place in England and Wales." Oh, really? So, you're telling me you suddenly have global jurisdiction for the purposes of "offensive content?"
Section 156, adding Section 66A (!), (1)(a) - "A person (A) who intentionally sends or gives a photograph or film of any person's genitals to another person (B) commits an offence if - (a) A intends that B will see the genitals and be caused alarm, distress, or humiliation."
Now, I'll admit that some things are the responsibility of the provider to strike quickly - and I already covered that (illegal content &c.). However, it is impossible to determine exactly what people will find "offensive," or just how "offensive" they'll find it, and this sort of law is going to lead to a patchwork of similar laws from various countries, making per-nation content filtering resemble a crazy quilt. That is the #1 reason Meta, X, et al. should not do country-level filtering - let the countries do it their damn selves. Akin to the "Great Firewall of China," any country that wants content (of whatever sort) censored can take that responsibility upon itself, so its people know whom to get mad at. Having each nation establish its own censorship 1) puts it in the right place (if any place can be said to be "right" for censorship), and 2) makes administration much easier for all concerned - each nation can add sites and suchlike to its own routing tables to block, and it doesn't have to bother the service providers.
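The division of labour I'm describing - the nation, not the platform, owns the blocklist - fits in a few lines. A toy sketch in Python (country codes and hostnames are made up for illustration; a real deployment would be routing tables or DNS filtering at the national level, not application code):

```python
# Each nation maintains its own blocklist; platforms never touch it.
NATIONAL_BLOCKLISTS = {
    "UK": {"banned-in-uk.example"},
    "CN": {"banned-in-uk.example", "blocked-everywhere.example"},
}

def is_blocked(country: str, hostname: str) -> bool:
    """True if the given country's own blocklist covers this hostname.
    A country with no blocklist blocks nothing."""
    return hostname in NATIONAL_BLOCKLISTS.get(country, set())

print(is_blocked("UK", "banned-in-uk.example"))  # → True: the UK's rule, the UK's problem
print(is_blocked("US", "banned-in-uk.example"))  # → False: no US list, nothing blocked
```

The administrative win is exactly what it looks like: adding or removing an entry touches one country's own table and nobody else's, instead of pushing a rule change through every provider on Earth.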
This law is horribly written, it's unnecessarily (and unconstitutionally) vague, it puts a subjective burden upon the provider, it assigns worldwide jurisdiction to the United Kingdom (which won't fly), and it's illogical (reading it made my migraine much worse!)
And its passage will only encourage more of this sort of nonsense, unfortunately. After all, as I've long said - "Nothing has legs like a bad idea."