Illustration: Nick Barclay / The Verge
Meta is mistakenly removing too much content across its apps, according to a top executive.
Nick Clegg, Meta’s president of global affairs, told reporters on Monday that the company’s moderation “error rates are still too high” and pledged to “improve the precision and accuracy with which we act on our rules.”
“We know that when enforcing our policies, our error rates are still too high, which gets in the way of the free expression that we set out to enable,” Clegg said during a press call I attended. “Too often, harmless content gets taken down, or restricted, and too many people get penalized unfairly.”
He said the company regrets aggressively removing posts about the COVID-19 pandemic. CEO Mark Zuckerberg recently told the Republican-led House Judiciary Committee that the decision was influenced by pressure from the Biden administration.
“We had very stringent rules removing very large volumes of content through the pandemic,” Clegg said. “No one during the pandemic knew how the pandemic was going to unfold, so this really is wisdom in hindsight. But with that hindsight, we feel that we overdid it a bit. We’re acutely aware because users quite rightly raised their voice and…