The semi-independent Oversight Board seemed to agree with the Journal's reporting, writing that "we found that the program appears more directly structured to satisfy business concerns" by essentially giving "certain users" extra protection from content moderation. The company had even failed to track whether cross-check was more accurate than its automated systems. Further, the board noted that the company had repeatedly lied to it about cross-check, which often gave celebrities a free pass on content, a practice exposed by Facebook whistleblower Frances Haugen in what's become known as The Facebook Papers.
Essentially, anyone with a strong online presence ended up "whitelisted," according to the 2021 WSJ article. Those on the list were given a full 24 hours to personally take down or change offending content and avoid any penalties, and most of them didn't even know they were on it. The system had reportedly included former President Donald Trump before he was banned in 2021; the company has not yet decided whether Trump will be let back on Facebook come 2023.
Citing thousands of pages of internal documents and several briefings with company executives, the board said it sometimes took "more than five days" before Facebook staff reviewed posts flagged under XCheck. All the while, the offending content remained up on the platform. The system effectively gave some accounts far more leeway to violate Facebook's policies: the Journal noted it blocked moderators from removing nude photos of a woman posted by prominent Brazilian soccer player Neymar da Silva Santos Jr., content that would have gotten any other account penalized under the company's policies.
The board's conclusions come more than a year after it originally accepted Meta's request to look into its internal systems. The board told Meta it needs to restructure its moderation systems, chiefly by being far more transparent about who is eligible for extra review when the moderation system makes mistakes. The board also said "high-severity" content needs to be removed or hidden while it's under review. Meta has 90 days to review the Oversight Board's opinions and respond.
This report is especially notable since the board usually sides against overt moderation of individual posts. The committee recently told the company to reinstate a post comparing the Russian army fighting in Ukraine to the Nazis in WWII, and it will also issue recommendations on whether Meta should rescind its COVID-19 misinformation policies. Come 2023, Facebook and Instagram's hidden moderation policies could look very different.