Soon, YouTubers will no longer be able to post politically incorrect or conservative content on the world’s largest video-sharing platform, after the Google-owned site announced that it would take down such videos.
The announcement was made yesterday on the platform’s official blog. The statement reiterated YouTube’s commitment to ‘investing in policies, resources, and products needed to live up to its responsibility to protect the YouTube community from harmful content.’
“We’ve been taking a close look at our approach towards hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free speech,” reads part of the blog post.
The platform says it will enforce three primary updates: removing hateful and supremacist content from YouTube, reducing questionable content while raising up authoritative voices, and, lastly, rewarding trusted creators and enforcing YouTube’s monetization policies.
It is not clear when the site will start purging ‘hate’ content, but the approach to addressing these issues is not entirely new. In principle, it can be argued that YouTube does not aim to actually delete the content of creators who fail to conform to the regulations; rather, it aims to ensure that all creators are aware of the change. If YouTubers know that their channels will be taken down should they not adhere to the rules, they will shape their content to align with the new regulations. Once creators understand this, YouTube will not need to spend many resources policing them, because they will voluntarily restrain themselves.
“The openness of YouTube’s platform has helped creativity and access to information thrive. It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination, and violence. We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come,” writes the YouTube Team in the post.