Facebook prepares to combat misuse of its platform in the 2020 Myanmar election


Facebook has said in a statement that it is preparing for Myanmar's general election in November by improving the detection and removal of hate speech and content that incites violence, and by working to prevent the spread of misinformation.

Rafael Frankel, Facebook's Director of Public Policy for Southeast Asia, Emerging Markets, explained recently that the company continues to focus on its responsibility to ensure the integrity of Myanmar's election on its platform.

In the statement he points out that Facebook recognizes Myanmar's complex social and political context and is sensitive to the tumultuous changes and the serious violence that have taken place since the country's last election in 2015.

According to the statement, Facebook plans to combat misuse of its platform by:

Preventing voter suppression: it would, for example, remove posts falsely claiming that a candidate is a Bengali, not a Myanmar citizen, and is therefore ineligible.

Combating hate speech: it has invested significantly in proactive detection technology to help it catch violating content more quickly, and it uses AI to proactively identify hate speech in 45 languages, including Burmese.

Making Pages more transparent: it also wants to make sure people are using Facebook authentically and understand who is speaking to them. So far, more than 40 political parties have been given a verified badge. This provides a blue tick on a party's Facebook Page and makes it easier for users to differentiate a real, official political party page from unofficial pages.

Limiting the spread of misinformation: it introduced an Image Context reshare product in Myanmar in June. With this product, users are shown a message when they attempt to share specific types of images, including photos that are more than a year old and that may come close to violating Facebook's guidelines on violent content.

Messenger forwarding limits: it has introduced a new feature that limits the number of times a message can be forwarded to five. These limits are a proven method of slowing the spread of viral misinformation that has the potential to cause real-world harm.

Third-party fact-checking: it now has three fact-checking partners in Myanmar – BOOM, AFP Fact Check and Fact Crescendo.