The EARN IT Act


I understand it has something to do with online platforms facing liability for criminal user content (especially with regard to child pornography). I’ve heard that it will cause major privacy issues, though. Here are my questions:

How specifically does it affect platform liability? How does it affect online privacy? What is section 230?

Not looking for opinions on whether the act is good or bad, just want to understand it.


4 Answers

Anonymous 0 Comments

Section 230 allows companies to avoid being held responsible for what users do on their platforms.

Anonymous 0 Comments

Depending on the political climate, you use either terrorism or child pornography as a vehicle for your goals. The goal of the Five Eyes is to get rid of end-to-end encryption in our daily, private communication.
Then, you add additional points to such a bill to make it less obvious.

Anonymous 0 Comments

Pre-internet, user-generated content on someone else’s platform wasn’t really a thing. You could self-publish (if you had the money), or you could try to convince some media company to share your stuff. Either way, whoever published it was responsible for the content…if you publish kiddie porn, you get busted. If you convince a media company to publish kiddie porn, they get busted. The idea was that the publisher had a responsibility to assess the legality of the content.

However, under the internet model of user-generated (and moderated) content, that’s just not practical…can you imagine if someone at YouTube had to review each video and figure out whether it was a copyright violation, a porn violation, or a criminal act (all of which vary by country and by the age of the participants) before it posted? The most popular social media sites literally can’t fully review all their user-uploaded content, so who gets the liability? Section 230 sorted that out…the liability is on the content creator, *not* the publisher. This is a unique carve-out for internet companies; traditional media like TV stations or newspapers can’t pull this off.

It has enabled the explosive growth of user-generated/moderated content that is the backbone of the modern internet. It has also allowed the major hosting platforms to completely abdicate any responsibility, legal or otherwise, for the damage done on and by their platforms.

EARN IT is an attempt to specifically address that for child pornography…it lays out the framework to establish the guidelines that the platforms have to follow and, crucially, allows governments to sue the *platform* when they don’t (under Section 230, they couldn’t usually do this before).

Anonymous 0 Comments

What they are putting on the sticker is that the act toughens some of the laws regarding child abuse-related content online and makes it clear that several of those laws apply to ANY abuse, not just pornography. It goes further and argues that social networks like Reddit or Twitter should be responsible for bad content that users post.

The problem is that it accomplishes those goals by doing things that can serve nefarious ends.

It goes so far as to argue that if encryption is being used and cannot be broken by the company, the company is being *complicit* in the posting of bad content. So an app like Signal, which promises you can communicate with encryption in a way that nobody can crack, would be damn near illegal. All it takes is for ONE person to post what the law calls “illegal content” on the network, and suddenly ALL content is suspect and the target of a lawsuit.

Some people don’t like it because the definition of “child abuse” is really subjective. Some people argue that letting a child identify as transsexual is “child abuse”. If that view gets legal precedent, then a parent could sue Discord for allowing servers that discuss resources for transsexuals without verifying that no participant is a minor. Worse, companies would be responsible for scanning *private* communications, not just public ones. So if any random user on Twitter sees a person talking about struggling with gender identity and DMs that person information about it, that could be construed as “a violation”, and Twitter could face criminal liability if it does not report everyone involved to the government.

So the worst case is that operating any kind of private, encrypted communication as an online service in the US either requires you to spy on EVERYTHING posted by every user and report ANYTHING that could qualify as criminal to the government or, more likely, pushes you to disallow encrypted or private communications entirely and severely restrict the content that can be posted publicly. Right now, even if someone subpoenas some private encrypted information, it’s legal for that information to be encrypted in a way the company cannot decrypt. Under this law, ALL encrypted content on websites would have to be encrypted in a way the company *could* decrypt, so the government can demand it be handed over for investigation. That would apply not just to bad, illegal content, but to any other content.
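To see why “encryption the company cannot decrypt” matters, here’s a minimal toy sketch (a deliberately fake XOR cipher, NOT real cryptography, with made-up names) of the difference between end-to-end encryption, where only the users hold the key, and a model where the company keeps a copy:

```python
import hashlib

def toy_encrypt(key: bytes, message: bytes) -> bytes:
    # Toy XOR stream cipher for illustration only -- do not use for real security.
    stream = b""
    counter = 0
    while len(stream) < len(message):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(m ^ s for m, s in zip(message, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# End-to-end model: only the two users ever hold the key.
alice_bob_key = b"shared secret known only to Alice and Bob"
ciphertext = toy_encrypt(alice_bob_key, b"meet at noon")

# The platform stores and relays only the ciphertext. Without the key,
# a subpoena to the platform yields unreadable bytes.
assert toy_decrypt(alice_bob_key, ciphertext) == b"meet at noon"

# Escrow model: the platform also keeps a copy of the key on its servers,
# so it (or a government demanding that copy) can decrypt everything.
escrowed_key = alice_bob_key
assert toy_decrypt(escrowed_key, ciphertext) == b"meet at noon"
```

The math is identical in both models; the only difference is who holds the key. That’s why critics say a law requiring company-decryptable encryption doesn’t “weaken” end-to-end encryption so much as forbid it outright.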

Some grumble that this sounds an awful lot like certain politicians, upset that encrypted communication tools can be used by their opponents to organize protests, spent 2016 through 2020 trying to remove social networks’ protection from liability so they could spy on this content, and have now decided to repackage that idea under the guise of “protecting children from abuse”.

## TL;DR:

This is a law that notes child abuse usually happens in the dark and behind closed doors, so it makes it illegal for anyone to turn off their lights, illegal to have doors installed in doorways, and legal to sue a company who has ever received a piece of mail that mentions light switches or doors unless they immediately report it to the authorities.