Meta revises Threads content moderation in response to user complaints

Adam Mosseri, who heads Instagram and its companion text-based app Threads, announced that Meta will adjust the platform's content enforcement systems after receiving numerous complaints about its moderation decisions.
According to Mosseri, the company has already taken steps to address the issues.
His comments come as Threads users have increasingly complained about enforcement decisions they describe as aggressive and, at times, bizarre.
In prominent examples, some users reported penalties against their accounts simply for using mundane words, regardless of context.
Mosseri did not explain precisely what caused the errors, but he hinted that an internal company tool had malfunctioned, preventing reviewers from seeing "sufficient context" for the posts they were evaluating.
Writing on his own Threads account, Mosseri said: "To those who have raised concerns about rule enforcement: we are looking into the matter, we found mistakes, and we have already made changes. One of the major issues was that reviewers were making decisions without the necessary context. That was a mistake. We are now working to fix this so they can make better decisions and make fewer errors. We are trying to provide a safer experience, and we need to do better."
Beyond moderation, Threads has drawn other recent complaints from users, including a rise in engagement-bait posts appearing in their feeds from accounts they do not follow.