By Patricia B. Mirasol

Voter interference and offers to buy or sell votes are not allowed on Facebook, as outlined in a Jan. 28 Zoom call by Meta Platforms, Inc., the Menlo Park-based technology conglomerate that owns the social media platform. Facebook’s latest policy updates also include the continued removal of coronavirus disease 2019 (COVID-19) misinformation that “could contribute to imminent physical harm.”

“We have policies in place to address the most harmful types of misinformation,” said Alice Budisatrijo, Meta’s head of misinformation policy in the Asia Pacific.

Facebook, however, will not remove all misinformation found on its platform.

“[The reason] why we don’t remove all misinformation is because, as a private company, we are not the arbiters of truth. … We don’t think any single actor should be,” she said.

“It would require us to know ‘the truth’ about every single thing in the world — which is impossible,” she added in a presentation.

Facebook’s voter interference policy disallows misrepresentation of whether a candidate is running or not, as well as misleading information on the details, methods, qualifications, and requirements for voting. It also disallows offers to buy or sell votes, instructions for illegally participating in a voting process, and claims that participation in voting leads to catching a communicable disease.

False claims related to COVID-19’s cures, treatments, and vaccines — particularly those that are unsupported by evidence or have already been debunked — are likewise removed from the platform.

“Community standards are a living document,” Ms. Budisatrijo said. “We continue to update policies as the pandemic evolves.”

Facebook has removed 24 million pieces of false COVID-19 and vaccine content, and has applied warning labels to a further 195 million pieces of content related to COVID-19 misinformation. According to the platform, two million people from 189 countries have connected to reliable health information via news feed pop-ups and its COVID-19 Information Center.

The social media network has three pillars for addressing misinformation: removing content that violates community standards; reducing distribution of low-quality content; and informing people by providing context.

In response to a query about the possibility of buying reactions to inflate the survey numbers of specific candidates on Facebook, Ms. Budisatrijo said the platform can already detect and demote content generated by such behavior. Moreover, while celebrities may change their page names to show support for a specific candidate, Facebook’s policy requires users to use their real names and not misrepresent themselves or their intentions.