These YouTube "monitors," as Wojcicki referred to them, are tasked with watching potentially "misleading" videos all day long in order to determine whether they should be censored or removed from YouTube. These include videos that question the official Sandy Hook narrative, as one example – and as of December 10, this will also include videos and channels that aren't commercially viable.
According to Wojcicki, YouTube's 10,000-strong army of censors has already cut the amount of time that Americans spend watching "controversial content" on YouTube by 70 percent – but this has apparently come at a serious cost to the censors' mental health and well-being.
"Our reviewers work five hours of the eight hours reviewing videos," Wojcicki told Stahl in response to Stahl's suggesting that it must be "very stressful" to have to be "looking at these questionable videos all the time."
"They have the opportunity to take a break whenever they want," Wojcicki added, noting that YouTube has also hired counselors and therapists to help "treat" these human censors when the content they're forced to watch negatively affects their mental health.
"We work really hard with all of our reviewers to make sure that, you know, we're providing the right services for them," Wojcicki said of what YouTube is doing to maintain its censorship army and keep the soldiers happy.
For more related news about Big Tech censorship, be sure to check out Censorship.news.
Hilariously, there's also a segment of this YouTube censorship army that's reportedly buying into some of the "conspiracy" videos that YouTube executives are having them watch and censor.
As it turns out, some of the concepts being presented in the "controversial content" on YouTube actually make logical sense. And if YouTube would simply allow its users to sift through it and make up their own minds, the whole matter would resolve itself without the need for thousands of human censors, therapists, counselors, and everything else that's associated with trying to maintain a digital police state.
But YouTube doesn't actually want its users to think for themselves. It would rather police all of the content that flows through its platform and decide what's "acceptable" for people to watch, effectively shaping the narrative from the top down.
And because of Section 230 of the Communications Decency Act (CDA), YouTube and the other tech giants continue to enjoy legal immunity for the content that's on their platforms, even as they engage in selective censorship – which, if we actually lived in a country where the law was enforced, would nullify Big Tech's legal immunity.
"No matter how harmful or untruthful, YouTube can't be held liable for any content, due to a legal protection called Section 230," Stahl told Wojcicki.
"The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn't you be held responsible for that material, because you recommend it?" she further asked – a suggestion Wojcicki rejected.
"If we were held liable for every single piece of content that we recommended, we would have to review it," Wojcicki stated. "That would mean there'd be a much smaller set of information that people would be finding. Much, much smaller."
Brighteon.com has emerged as the free speech alternative to YouTube, and is now used by thousands of channels who have posted over a hundred thousand videos.