Oversight Board criticizes Facebook for giving special treatment to high-profile users: NPR

On Thursday, the Facebook Oversight Board found that the social network’s “cross-check” program for high-profile users lacked transparency.

Jeff Chiu / AP

Facebook’s Oversight Board said in a report Thursday that the social network “has not been fully open” about how it allows millions of prominent users to evade the content moderation rules it applies to everyone else, a practice known within the company as “cross-check.”

“The fact that Facebook has provided such an ambiguous and poorly detailed response to a call for greater transparency is not acceptable,” the board wrote in its report. “Facebook’s response does not offer any meaningful transparency on the criteria for selecting which accounts or pages to include in cross-check.”

Facebook’s VIP list of celebrities, politicians and other prominent figures was highlighted by The Wall Street Journal, which reported that the social network launched the program as a “quality control measure” for enforcement actions taken against high-profile accounts. In practice, according to the Journal’s reporting, some users are “whitelisted,” or virtually immune to any enforcement action.

“The amount of publicly available information on cross-check is too limited,” said Thomas Hughes, director of the Oversight Board, in an interview with NPR. “Users need to know what is being done, when and why. If it is all done in an opaque, invisible way, it fuels the belief that something untoward is happening.”

The Oversight Board, which Facebook created and funds through an independent trust, includes a group of experts from around the world. It issues decisions that are binding on the company and makes policy suggestions that are not.

In its report, the board said Facebook should publicly explain its rationale for including accounts in its cross-check program. The board, at Facebook’s request, will review the cross-check system and publish guidance on how it should change.

Facebook acknowledged, according to the board’s report, that it should not have said that cross-check applied only to “a small number of decisions.”

“Facebook noted that for teams operating at the scale of millions of content decisions per day, the numbers involved in cross-checking appear relatively low, but acknowledged that its wording could appear misleading,” the report said.

Describing the program as “small,” Hughes told NPR, “was not appropriate,” noting that the program would cover nearly 6 million users. “It’s not small,” he said.

When the board reviewed Facebook’s decision to ban former President Donald Trump, the company was not forthright about the program, only referring to it when asked what type of content rules applied to Trump’s account.

“The Board notes that, in the Trump decision, Facebook declined to answer any of the Board’s questions about whether the company had been contacted by politicians or their staff about the suspension of Mr. Trump’s accounts,” according to Thursday’s report.

The board also noted its precarious role, in which it can only fully assess Facebook’s actions if the company cooperates.

“The credibility of the Oversight Board, our working relationship with Facebook and our ability to make sound judgments on cases all depend on being able to trust that the information provided to us by Facebook is accurate, complete and paints a full picture of the topic at hand,” the report said.

A Facebook spokesperson said in a statement that the company has asked the board to review the cross-check program as it strives “to be more clear in our explanations going forward.”

In its review of the program, the board said it intended to “ensure fairness and objectivity” in how Facebook implements the cross-check. The company has agreed to provide the board with documentation on the operation of the system.

“This should give a better understanding of the work Facebook has already done on a given topic. We will include an analysis of whether Facebook is meeting this commitment in our future transparency reports,” the report said.

The board said it plans to issue recommendations to Facebook on how the system can be changed.

Editor’s note: Facebook is among NPR’s financial supporters.

