Facebook Reveals Its Internal Rules For Removing Controversial Content
Facebook utilizes a combination of AI and human moderators to weed out content that goes against its policies, but those “tools” aren’t always able to discern differences in context, humour or satire, which means questionable content sometimes slips through the cracks.
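That division of labour can be pictured with a small, purely illustrative sketch; the function names, thresholds and scoring logic below are assumptions made for the example, not Facebook’s actual pipeline. The idea is simply that an automated classifier handles clear-cut cases, while uncertain or user-reported posts are routed to human reviewers who can judge context, humour and satire.

```python
# Purely illustrative sketch of a hybrid moderation pipeline (assumed design,
# not Facebook's actual system): an automated classifier handles clear cases,
# and uncertain or user-reported posts are escalated to human reviewers.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: int
    text: str
    user_reports: int = 0  # how many times other users flagged this post


def classifier_score(post: Post) -> float:
    """Stand-in for an ML model returning the probability a post violates policy."""
    # A real system would call a trained model; this toy heuristic just
    # illustrates that the output is a confidence score between 0 and 1.
    return 0.9 if "threat" in post.text.lower() else 0.2


def triage(post: Post, remove_at: float = 0.95, review_at: float = 0.5) -> str:
    """Decide what happens to a post: auto-remove, human review, or leave up."""
    score = classifier_score(post)
    if score >= remove_at:
        return "auto_remove"      # high-confidence policy violation
    if score >= review_at or post.user_reports > 0:
        return "human_review"     # ambiguous or reported: needs human judgement
    return "leave_up"


if __name__ == "__main__":
    examples = [
        Post(1, "Explicit threat of violence"),
        Post(2, "Satirical joke a classifier might misread", user_reports=3),
        Post(3, "Ordinary status update"),
    ]
    for post in examples:
        print(post.post_id, triage(post))
```

In a real system the thresholds and escalation rules would be far more elaborate, but the sketch captures why borderline content can still slip through: a model’s confidence score is a poor proxy for context.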
Bickert said that with the document opened up to the public today, people will be able to see the rationale behind each policy and the context for its granular details.
She insisted that Facebook considers the safety of its users to be paramount “and that’s really why we are publishing this new set of community standards“.
According to The Daily Dot, the 27-page document is essentially a blueprint for the company’s moderators, so that posts containing violence, harassment, pornography or abuse can be more easily identified and removed.
What are the consequences for someone found to have posted objectionable content on Facebook?
Facebook has published the secretive rules its 7,500 content moderators use to remove posts likely to promote terrorism, incite violence or breach company policies covering everything from hate speech to child exploitation, sex, bullying and drugs.
As the Post writes, Facebook’s censors, or “content moderators”, have been “chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs”. “These decisions are among the most important we make because they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view,” the company says.
The company advises not to post content that depicts real people and “mocks their implied or actual serious physical injuries, disease, or disability, non-consensual sexual touching, or premature death”.
“Our policies are only as good as the strength and accuracy of our enforcement – and our enforcement isn’t flawless”, writes Facebook’s VP of global policy management, Monika Bickert, in a blog post announcing the company’s internal enforcement guidelines.
“We believe giving people a voice in the process is another essential component of building a fair system”, she added.
Facebook, the world’s largest social network, has become a dominant source of information in many countries around the world. Still, there are not enough eyeballs looking at content, and Facebook is prone to making mistakes. I’m married to a boudoir photographer, so my first order of business was to check the nudity rules, which remain largely unchanged, but are far more descriptive than before.
Facebook has published its internal enforcement guidelines and has also expanded its appeals process.
The diversity of content reviewers and their bosses has also been called into question. While Bickert said she feared terrorists or hate groups would simply adapt to these new guidelines and sidestep any potential censorship or removal, she is hopeful that the efforts being undertaken will lead to net-positive results.
Here Facebook lays out what it can do to help various users.
Facebook executives say it’s a hard balancing act to weigh what is acceptable expression and what is not.
It gives users a better understanding of the nuances reviewers must recognize to make correct decisions on content that may not be clearly defined in the rules. Where the rules clearly apply, the post will carry a note that it has already been reviewed.
During his testimony on Cambridge Analytica, Zuckerberg was asked by Sen.
Drawing that line appears tricky.
The company said it spots potentially problematic content by using either artificial intelligence or reports from other users.
Attendees included people who specialize in public policy, legal matters, product development, communication and other areas. Some things in the document stand out as pretty interesting tidbits that many, myself included, may not have been aware of. But he acknowledged there are challenges.