How do YouTube and other big companies deal with reporting?

707 views

Do they have to watch it after it gets a certain number of reports? Is it all algorithmic?

In: Other

6 Answers

Anonymous 0 Comments

A little of column A, a little of column B, actually. Most will take an algorithmic pass at it first; if nothing’s found, it can pass on to human review. As for the number of reports, I’m sure some companies use that as a metric, but it would just be an arbitrary, company-specific threshold, different everywhere.
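
As a rough illustration of that “algorithm first, arbitrary threshold” flow, here’s a minimal sketch in Python. Everything in it (the threshold value, the spam score, the queue) is invented for illustration; none of it is any real platform’s system.

```python
# Hypothetical triage: an automated check runs first; only inconclusive
# cases that cross some company-specific report threshold reach a human.
REPORT_THRESHOLD = 5  # made-up number; every company picks its own

human_review_queue = []

def automated_check(video):
    """Stand-in for an ML classifier: returns a verdict, or None if unsure."""
    if video["spam_score"] > 0.9:
        return "remove"
    if video["spam_score"] < 0.1:
        return "keep"
    return None  # inconclusive

def handle_report(video):
    video["reports"] += 1
    verdict = automated_check(video)
    if verdict is not None:
        return verdict
    if video["reports"] >= REPORT_THRESHOLD:
        human_review_queue.append(video)  # a person takes a look
        return "pending_human_review"
    return "no_action_yet"

video = {"id": "abc123", "spam_score": 0.5, "reports": 3}
print(handle_report(video))  # no_action_yet (4 reports, below threshold)
print(handle_report(video))  # pending_human_review (5 reports hits it)
```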

Anonymous 0 Comments

No one outside of YouTube knows the real answer, but from watching videos of YouTubers talking about how they fight off DMCA (copyright strike) claims, I suspect that YouTube uses an automated system to deal with reporting.

Basically, pretty much anyone can issue a copyright strike against someone’s content for reasons such as stolen music or, also pretty common, reused footage from another YouTube clip.

In many cases the use of both music and footage is covered under “fair use,” but YouTube doesn’t know or care. All they know is that your content has been reported for a DMCA violation, so they take your video down and wait for you to contest the claim.

Then, and only then, might you get a human being to actually review the case and determine whether you violated copyright. According to YouTubers, that can take anywhere from hours to days to a week or more. In the meantime the copyright troll has taken your video down, and you aren’t earning views or revenue, perhaps on a video covering a very timely topic.
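
Strung together, the takedown-first lifecycle described above looks something like the sketch below. Only the claim-then-counter-claim flow comes from the comment; the class, state names, and method names are all hypothetical.

```python
# Sketch of the takedown-first DMCA flow: the video comes down the moment
# a claim arrives, and a human only looks at it after a counter-claim.
class Video:
    def __init__(self, title):
        self.title = title
        self.status = "live"

    def receive_dmca_claim(self):
        # No human review at this step: the claim alone triggers removal.
        self.status = "taken_down"

    def file_counter_claim(self):
        # Only now is the case queued for an actual person, which
        # (per the comment) can take hours to a week or more.
        self.status = "awaiting_human_review"

    def human_review(self, fair_use):
        self.status = "reinstated" if fair_use else "taken_down"

v = Video("news commentary")
v.receive_dmca_claim()         # troll files a claim -> video is down
v.file_counter_claim()         # uploader contests -> waits in a queue
v.human_review(fair_use=True)  # days later, a human reinstates it
print(v.status)                # reinstated
```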

Anonymous 0 Comments

They don’t do much. It’s easily abusable, since most of them just go by how many reports a video receives.

Anonymous 0 Comments

It depends on what the content is reported for. For violence, hate speech, bullying, etc., people are employed or contracted to look at reported content and judge whether it actually violates the standards. If it’s a report for spam or copyright infringement or something like that, the process is more automatic.
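
In code, that split might look something like the toy router below. The category names come from the comment above; how reports are actually wired up internally is anyone’s guess.

```python
# Hypothetical routing: judgment-heavy categories go to human reviewers,
# while mechanical ones (spam, copyright) start in automated pipelines.
HUMAN_REVIEW_CATEGORIES = {"violence", "hate_speech", "bullying"}
AUTOMATED_CATEGORIES = {"spam", "copyright"}

def route_report(category):
    if category in HUMAN_REVIEW_CATEGORIES:
        return "human_moderator_queue"
    if category in AUTOMATED_CATEGORIES:
        return "automated_pipeline"
    return "triage"  # anything unrecognized gets sorted later

print(route_report("hate_speech"))  # human_moderator_queue
print(route_report("spam"))         # automated_pipeline
```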

For anyone who is interested, here are some links to stories from The Verge about the lives of content moderators at [YouTube](https://www.theverge.com/2018/3/13/17117554/youtube-content-moderators-limit-four-hours-sxsw) and [Facebook](https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona). Warning: this is SUPER depressing to read about.

Anonymous 0 Comments

I used to work for a huge company that does community moderation for big tech giants such as YouTube and Facebook. Our industry is known as BPO (Business Process Outsourcing). A client as big as YouTube can have many smaller projects for different kinds of services (content moderation, ads moderation, etc.). I didn’t work on YouTube (some of my friends did), but I worked on Facebook, which is pretty similar in many ways.

There are two main methods of community moderation: proactive and reactive. Proactive means you actively and automatically look for policy infringements from users, either by AI (blacklisted keywords, image recognition, user-behavior patterns, etc.) or by active moderators. What you’re asking about, on the other hand, is reactive, which means waiting for users to report potentially violating content. You don’t have to report more than once for your case to be reviewed; you just need to know how to report correctly! (Yes, the report process may not be as straightforward as you think.) In both methods, the end result is a pool of tickets waiting to be processed.
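
To make the two intake paths concrete, here’s a toy sketch. The keyword list, image hashes, and ticket shape are all invented; the only point taken from the comment is that both paths feed the same ticket pool.

```python
# Toy version of the two intake paths: proactive scanning and user
# reports. Both end up in the same pool of tickets to be processed.
ticket_pool = []

BLACKLISTED_KEYWORDS = {"buy followers", "crypto giveaway"}  # made up
KNOWN_BAD_IMAGE_HASHES = {"d41d8cd98f00b204"}                # made up

def proactive_scan(post):
    """Proactive: the platform goes looking for violations itself."""
    if any(kw in post["text"].lower() for kw in BLACKLISTED_KEYWORDS):
        ticket_pool.append({"post": post, "source": "keyword_scan"})
    elif post["image_hash"] in KNOWN_BAD_IMAGE_HASHES:
        ticket_pool.append({"post": post, "source": "image_match"})

def user_report(post, reason):
    """Reactive: one correctly filed report is enough to create a ticket."""
    ticket_pool.append({"post": post, "source": "user_report",
                        "reason": reason})

post = {"text": "Buy followers here!", "image_hash": "aaaa"}
proactive_scan(post)              # caught by the keyword blacklist
user_report(post, reason="spam")  # also reported by a user
print(len(ticket_pool))           # 2 tickets, one from each path
```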

So how do we process it? If the task is simple and well defined, for example an exact match of a previously detected violating image, the system deals with it algorithmically according to our policies. If the task is more complicated, it gets escalated to manual review, and yes, we have enough manpower to watch most of what you report. Trust me, it’s a massive army! Most of the time the volume of user-reported tickets is steady and manageable, but sometimes it can spike drastically due to bugs, users exploiting tools/bots, or trending events. We have procedures for those outlier situations, so they aren’t a big deal. If the workload is reliably increasing, we hire more people. Tech giants like Facebook and Google are insanely rich; honestly, considering how much they pay us, this is just pocket change to them.
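
Processing that pool might look roughly like this. The “exact match of previously actioned content gets handled automatically” rule is from the comment; the hash lookup is a hypothetical way to implement it.

```python
# Sketch of ticket processing: exact matches of previously actioned
# content are handled automatically; anything else goes to a person.
PREVIOUSLY_ACTIONED_HASHES = {"d41d8cd98f00b204"}  # hypothetical

manual_review_queue = []

def process_ticket(ticket):
    if ticket["post"]["image_hash"] in PREVIOUSLY_ACTIONED_HASHES:
        return "auto_removed"           # well-defined, seen before
    manual_review_queue.append(ticket)  # complicated -> escalate
    return "escalated_to_human"

ticket = {"post": {"image_hash": "d41d8cd98f00b204"}}
print(process_ticket(ticket))  # auto_removed
```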

Hope it helped!

TL;DR: some reports are dealt with algorithmically, but most of the time they are manually reviewed, even after just one report.

Anonymous 0 Comments

It depends on whether you’re referring to community-guidelines strikes or DMCA strikes.

DMCA is a legal mechanism, and streaming sites are not in a position to determine the legal ownership of a video, song, etc., so once someone files a DMCA claim, the site will take action until a counter-claim is made. After that, the results vary.

In regards to community guidelines, the important thing to remember is that context is often a huge consideration. For example, medical footage with a sufficient description on an account devoted to medical videos may be acceptable even if there’s some level of nudity; but if you took that same footage and gave it the title “hot boobs,” or if the channel were devoted to nude scenes, or if there were no supporting description to explain why the footage was uploaded, then it might receive a strike. Algorithms are not that great at identifying context, so the major sites will often have an army of human reviewers who are much better at it.

The problem with human reviewers is that subjectivity varies from person to person, so you can get decisions that seem a bit odd. For example, a person from a very conservative culture may decide that someone wearing a bikini is “minimally covered,” whereas a person from a more liberal culture may consider a bikini fully covered. Since major sites employ people from all around the world, you can see how a reviewer who has just joined might make a harsh call on a video because of their subjective understanding of the term “minimally covered.” Add the fact that reviewers (if not directly employed by the site) are often contractors or vendors, and there’s always a new reviewer who will make a mistake on a video; then Reddit will collectively decide that the site itself purposely removed the content, hates free speech, and wants to censor everything.

Algos are great at identifying duplicate footage of known violating content, such as a commonly uploaded scene from hardcore porn. An algo can also see that a scene has been uploaded 100 times and removed 100 times, so there’s no need for a human to look at the context; it might as well be removed automatically. Algos can also determine that a video was watched 100,000 times before receiving its first report, so there’s likely no reason to review it after one flag; but if that same video gets flagged 100 more times, maybe a human should review it. On the flip side, if a video has been viewed 5 times and flagged 5 times, a human should probably review it, and sooner than the video with 100k views and 100 flags.
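
A rough sketch of that last prioritization heuristic, with a made-up scoring function (flags per view), just to show why the 5-view/5-flag video jumps the queue:

```python
# Toy priority score: many flags relative to views is suspicious and
# jumps the review queue; many views with few flags can safely wait.
def review_priority(views, flags):
    if flags == 0:
        return 0.0
    return flags / max(views, 1)  # flags per view

videos = [
    {"id": "popular", "views": 100_000, "flags": 100},
    {"id": "obscure", "views": 5, "flags": 5},
]
for v in sorted(videos, key=lambda v: review_priority(v["views"], v["flags"]),
                reverse=True):
    print(v["id"], review_priority(v["views"], v["flags"]))
# obscure 1.0    <- reviewed first
# popular 0.001
```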