The UK Government has made several moves towards protecting users from harm in online communities, but much of this legislation targets major platforms rather than the communities that operate on them.
For example, the UK’s planned Online Safety Bill targets platforms “which allow users to post their own content online or interact with each other”.
“Those platforms which fail to protect people will need to answer to the regulator, and could face fines of up to ten per cent of their revenues or, in the most serious cases, being blocked,” it says.
Again, this takes a much broader view of unwanted content, and it targets the platforms for hosting it rather than the communities themselves. However, if you operate an online community, it is worth thinking through the consequences should the bill pass, such as how it might affect the moderation and content policies of communities you host on major platforms.
The situation in the US is a patchwork of federal and state laws governing online behaviour. But one controversial federal law has an outsized influence over how online content is managed.

That law is Section 230 of Title 47 of the United States Code, enacted as part of the Communications Decency Act, which gives platforms immunity from liability for third-party content posted to them. That means, for example, that Facebook cannot be held liable for defamatory content posted to its platform.

In essence, platforms are treated as distributors of content rather than publishers. They are still compelled to remove some content, such as copyright infringement and material related to sex trafficking.
Generally speaking, operators of social media pages and hosted online communities are also not liable for what is posted in their communities — unless, of course, it is the operator themselves who posted it.