Facebook’s Community Standards are now available in the Khmer language

Every day, people use Facebook to share their experiences, connect with friends and family, and build communities. Because social media should be a place where people feel empowered to communicate, Facebook focuses on keeping abuse off the platform and has developed a set of Community Standards that outline what is and is not allowed on Facebook.

These Community Standards are now available in the Khmer language.

So what are they all about?

Facebook Community Standards are based on feedback from the social media community and the advice of experts in fields such as technology, public safety and human rights. They are designed to be inclusive of different views and beliefs, in particular those of people and communities that might otherwise be overlooked or marginalized.

Five core values guide the development of the Community Standards: voice, authenticity, safety, privacy and dignity.

The Community Standards are constantly updated as online behaviour changes. Facebook’s content policy team is based in more than 11 offices around the world and is made up of experts on diverse topics such as terrorism, hate speech and child safety. They cover a wide range of policy areas to catch all kinds of harmful content, from bullying and harassment to graphic violence and credible threats, all the way to fake accounts, fraud and impersonation.

To enforce these policies, Facebook uses a combination of reports from its community, reviews by its teams, and technology that identifies and reviews content against the standards.

Anything on Facebook can be reported: a page, profile, post, photo or comment. Anyone can report content they believe violates the standards. People can also customize and control what they see by unfollowing, blocking or snoozing people, and by hiding posts, people or pages.

Facebook relies primarily on artificial intelligence to detect violating content on Facebook and Instagram. Algorithms are getting better all the time at identifying content that violates the Community Standards and automatically taking it down before anyone sees it.
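
As a rough illustration of how that kind of pipeline can work, the sketch below scores each post and routes it by confidence: very high scores are removed automatically, borderline scores go to a human review queue, and everything else is left up. The thresholds and the score_content() stand-in are hypothetical; this is not Facebook’s actual system.

```python
# A minimal sketch of confidence-threshold moderation routing.
# The thresholds, the flagged terms and score_content() are hypothetical,
# for illustration only; this is not Facebook's actual system.

AUTO_REMOVE_THRESHOLD = 0.95   # confident enough to take content down automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain: queue for a human reviewer

def score_content(text: str) -> float:
    """Stand-in for a trained classifier; returns a violation probability.

    A real pipeline would use a machine-learned model; here we fake a
    score so the routing logic below is runnable.
    """
    flagged_terms = {"spam-link", "buy-followers"}  # hypothetical examples
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, hits * 0.5)

def route(text: str) -> str:
    score = score_content(text)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # taken down before anyone sees it
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # sent to a trained reviewer
    return "allow"

if __name__ == "__main__":
    for post in ["hello friends", "buy-followers here! spam-link inside"]:
        print(post, "->", route(post))
```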

Of the content removed from Facebook for organized hate, firearms, adult nudity and sexual activity, drug sales, suicide and self-injury, fake accounts, terrorist propaganda, child nudity and exploitation, and violent and graphic content, 95% is found proactively using technology.

But there are still many cases in which a human reviewer is critical to enforcing the standards fairly and accurately, for example in the case of hate speech. The systems can recognize specific words that are commonly used as hate speech, but not the intentions of the people who use them, so such content has to be reviewed by people.
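
To make that limitation concrete, here is a toy keyword filter (hypothetical, not Facebook’s system): a slur used as an attack and the same word quoted by its target both trigger the filter, and only a human reviewer can tell the two apart.

```python
# A toy keyword filter showing why word matching alone cannot capture intent.
# BANNED_WORDS is a hypothetical placeholder, not a real policy list.

BANNED_WORDS = {"badword"}  # stand-in for terms commonly used as hate speech

def flag(text: str) -> bool:
    """Flag a post if any word, stripped of punctuation, is on the list."""
    return any(word.strip('.,!?"') in BANNED_WORDS
               for word in text.lower().split())

# An attack and a post quoting that attack both contain the word, so the
# filter flags both; only a human reviewer can tell the intentions apart.
attack = "you are a badword"
quoting_victim = 'Someone called me a "badword" and it really hurt.'
print(flag(attack), flag(quoting_victim))  # True True
```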

Around the world, over 35,000 people work in safety and security at Facebook, of which about 15,000 are content reviewers. This team reviews content 24/7 and includes native speakers of many languages. The reviewers undergo extensive training when they join and throughout their time at Facebook, and they receive different kinds of support, including access to wellness and psychological support teams.

To show how it enforces these policies, Facebook regularly publishes Community Standards Enforcement Reports, which share figures on content found to violate the Community Standards.

To learn more, you can visit Facebook’s Community Standards here.