“The council is concerned about how Meta has prioritized its economic interests over content moderation,” said the body, which is classified as independent but financed by the company.
In its report, the council calls for a “significant overhaul” of the double-review program known as “cross-check” to make it more transparent, responsive and fair.
Currently, when posts or images that potentially violate the policies of Facebook or Instagram are flagged, they are immediately removed if they are considered too risky and come from unknown users. But if the author is well known, the content remains online while it is examined further, a process that often takes several days, and sometimes months.
This “uneven” two-tier system “offers additional protection to the expression of certain users, selected in part according to Meta’s economic interests,” the report details.
This means that “content identified as violating Meta’s rules remains visible on Facebook and Instagram, spreading virally and causing potential harm,” the council warned.
The council is made up of 20 international members, including journalists, lawyers, human rights defenders and former political leaders. Its creation was proposed in 2020 by CEO Mark Zuckerberg, and it is tasked with evaluating the Californian group’s content moderation policies.
(With AFP)