Meta’s main content moderation subcontractor in Africa, Sama, will maintain its B Corp certification until a case against the Kenya-based company over claims of union-busting and exploitation is settled. The case, which also names Meta as a respondent, was filed in May this year by Daniel Motaung, a former content moderator in the East African country.
Corporate responsibility group B Lab told TechCrunch that the decision to maintain Sama’s certification was made after its standards management team completed an initial review of the allegations against the company, as documented in a Time magazine article, and of similar complaints received through its own complaints process.
B Corp status is a stamp of approval for companies that meet high standards of transparency, performance and accountability, assessed across several factors including employee well-being, company structure and work processes. The certification may be one reason Sama bills itself as an ethical AI company.
“In cases where legal or regulatory action is possible, B Lab does not conduct independent concurrent investigations. Our complaints process recognizes the power of the legal processes and relies on the outcomes of those rulings,” said B Lab.
“After the outcome of the lawsuit is known, further action may also be taken against Sama in the form of a formal investigation with a decision on eligibility by B Lab’s Standards Advisory Council. This may also require an on-site visit by B Lab to Sama offices in East Africa and interviews with content moderation staff,” it said.
B Lab said it has paused certification of new companies that employ content moderators, adding that it will introduce new risk standards such companies must meet to qualify for the status.
The new standards cover transparency, especially during hiring; access to wellness programs; and corporate responsibility for monitoring employee health.
Case filings allege that Sama ran a “misleading recruitment process,” publishing job postings that did not specify the nature of the work successful applicants would be doing at its Nairobi hub. The moderators come from a number of countries, including Ethiopia, Uganda and Somalia.
According to court records, Motaung, who was fired in 2019 for organizing a strike and trying to unionize the subcontractor’s employees, said his job exposed him to graphic content that has had a lasting effect on his mental health.
The moderators scour posts across all of Meta’s platforms, including Facebook, removing content that perpetrates and perpetuates hate, misinformation and violence.
Motaung seeks financial compensation for himself and other former and current moderators, and also wants Sama and Meta ordered to stop busting unions and to provide mental health support, among other demands.
Meta wants the case dismissed, noting that the moderators had signed nondisclosure agreements that prevent them from submitting evidence in the case.