TIME100 AI 2024: Kauna Malgwi

TIME photo illustration (Source: Courtesy of Kauna Malgwi)

Kauna Malgwi had just started college at a university in northern Nigeria when the terrorist group Boko Haram launched a major insurgency. The Islamist militants, whose name means “Western education is forbidden,” raided universities like Malgwi’s, kidnapping female students and planting roadside bombs. On one occasion, Malgwi says, one of the bombs exploded just five minutes after her vehicle passed it. The militants particularly targeted Christians like her. In 2012, she fled south to Abuja, the capital, with her mother and aunt.

Malgwi completed her education in Kenya and in 2019 signed up for what she thought was a call center job with the contractor Sama. In reality, she says, she discovered the role was freelance content moderation for Facebook: watching videos of war atrocities, rapes, and suicides in order to remove them from the platform. (Sama disputes her account. “All team members understood the nature of the work before it began,” a spokesperson said.) The work helped Facebook’s parent company, Meta, train its AI systems to detect similar content in the future. But in 2023, after a colleague exposed low pay and alleged union-busting at Sama, the contractor pulled the plug, and Malgwi and about 260 others were fired. A Kenyan court later ruled that Meta was the moderators’ primary employer, but the tech giant is appealing the decision and has yet to pay the severance the moderators believe they are owed. Meta did not respond to a request for comment. Sama says its decision to end the contract with Meta was a business decision, unrelated to the whistleblower allegations.

As the case drags on, Malgwi is emerging as one of the leading voices on the invisible data labor that underpins so much of today’s advanced AI. She leads the Nigerian chapter of the Content Moderators Union, where she sees it as her duty to ensure that young people know their rights when they sign up for jobs at tech companies. That’s crucial, she says, because Meta and others are now looking to hire content moderators in jurisdictions other than Kenya, in the face of legal challenges brought by Malgwi and her colleagues. “The fear that you might get fired for raising concerns is starting to subside, as people start to realize that they have rights as employees,” Malgwi says of the work her team has already done in Nigeria. Her ambition is to spearhead similar efforts across the continent. “Imagine if there was that kind of pressure from all the African countries—Big Tech would have no choice but to do what’s right.”

Malgwi has even helped influence law outside of Africa. In February, she testified at the European Parliament in Brussels, where she told lawmakers about the insomnia she still suffers because of her job; the paranoia she feels when she sees men with young children, after watching videos of child abuse; and the pride she takes in possibly saving lives by escalating terrorist content before attacks happen. She wiped away tears after giving her testimony. Two months later, the Parliament passed the Platform Work Directive, a law that would regulate the employment status and rights of platform workers like Uber drivers and content moderators in EU member states. While Malgwi won’t benefit directly, she sees it as a step in the right direction that could have a global knock-on effect. “It will have a positive impact,” she says. “If this kind of legislation is in place, I’m sure even Facebook knows that one day things will change as well.”

*Disclosure: Sama’s investors include Salesforce, whose CEO is TIME co-chairman and owner Marc Benioff.

Write to Billy Perrigo at [email protected].