
Meta's Oversight Board comes out swinging against Facebook's VIP 'cross-check' program

The Oversight Board, an independent body set up to review Facebook and Instagram content decisions and policies, slammed the company on Tuesday over its cross-check program. In its statement, the Oversight Board laid out a number of changes Facebook's parent company Meta should make regarding content moderation across its social media platforms.

Cross-check is an internal Facebook program portrayed as a "quality control" measure: a way to double-check moderation decisions involving high-profile Facebook users. As Facebook reviews millions of pieces of content a day, the company is bound to make mistakes. The cross-check system was put in place to help limit faulty content takedowns for users deemed a priority to the company.

However, according to a report from the Wall Street Journal, the program effectively created a two-tier moderation system: one for high-profile Facebook users and another for everyone else.


Thanks to cross-check, celebrities, politicians, and other influencers were able to routinely break Facebook and Instagram's rules without facing the penalties doled out to every other user. As many as 5.8 million accounts were on the cross-check whitelist at one point, including former president Donald Trump and Mark Zuckerberg himself.

The Board came out hard in its statement, accusing Meta of initially not being truthful with it about the cross-check program.


"In our review, we found several shortcomings in Meta’s cross-check program," writes the Oversight Board. "While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns." 

While the Board said it understands that Meta is a business, it found that the company failed to uphold its own content policies, failed to track data on the program's results, and failed to be transparent about the program.

The cross-check program first came to light when Facebook whistleblower Frances Haugen shared internal documents regarding real-world harm stemming from the social media platform. Haugen briefed the Oversight Board on cross-check, among other issues revealed in the documents.


The Oversight Board suggested a number of changes to the program, mostly surrounding transparency. For example, the Oversight Board said that Meta should mark the accounts of users that are part of the cross-check program. The Board suggested that this would "allow the public to hold privileged users accountable for whether protected entities are upholding their commitment to follow the rules."

In addition, the Oversight Board recommended that Facebook still take down certain "high severity" content, regardless of whether a user is part of cross-check. If a user whitelisted by cross-check repeatedly breaks the rules, the Oversight Board suggests that Meta remove their account from the program.

While Meta acts on the Oversight Board's rulings on individual content moderation decisions, such as the reinstatement of a particular post, the Board's policy change recommendations are just that: recommendations. Meta is not bound to adopt the Oversight Board's suggested changes to the cross-check program.

Topics: Facebook, Instagram, Social Media, Meta
