Meta promises to fix moderation mistakes on Threads.
Adam Mosseri, who leads the social media platform Threads, announced that Meta will adjust the platform's content moderation systems after receiving numerous complaints about moderation decisions.
According to Mosseri, the company has already taken steps to address the recent issues.
His comments come as Threads users have increasingly complained about aggressive and sometimes baffling moderation decisions.
In prominent examples, some users reported being penalized for using ordinary words, with no regard for their context.
Mosseri did not specify the exact cause of these errors, but he indicated that an internal company tool may have malfunctioned, preventing human reviewers from seeing sufficient context for the posts they were reviewing.
Mosseri said in a post on his personal Threads account: "Regarding those who have expressed concerns about rule enforcement: we are looking into it, we have found mistakes, and we have already made changes. One of the main issues was that reviewers were making decisions without the necessary context. That was a mistake. We are fixing this so they can make better decisions and we make fewer errors. We are trying to provide a safer experience, and we need to do better."
Beyond the moderation issues, Threads users have also recently complained about seeing more engagement-bait content in their feeds from accounts they do not follow.