Facebook's crackdown on dangerous content in groups could backfire, experts say


Facebook is changing its rules on private groups amid growing criticism that some closed communities on the platform are uniting extremists and spreading fake news.

The company announced in a blogpost on Wednesday that it would take a more “proactive” approach to detecting problematic content in groups and would work to make the communities more transparent.

“Being in a private group doesn’t mean that your actions should go unchecked,” Tom Alison, Facebook’s vice president of engineering, said in a statement.

Facebook said it has built a new tool called Group Quality that will use artificial intelligence to scan groups for content that violates community standards. The tool will also give administrators more power over what goes into their groups, along with insight into why posts are removed.

Facebook will also change privacy settings to allow administrators to make groups either “public” or “private”, a move that would remove certain groups from search results.

But rather than helping to curb the rise of hate speech and misinformation, some critics worry, the changes may help push questionable content further underground. Facebook should give independent researchers access to similar tools so they can gain insight into what is happening in private groups, said Benjamin Decker, CEO of the digital investigations consultancy Memetica.

“Many of the groups I study are now changing their privacy status to be closed and not visible, making it more difficult for content violations to be identified by outsiders,” he said. “I am very concerned this will further allow conspiracy communities and violent extremists to further obfuscate their activity.”

The new rules also rely on administrators of groups, who police content in these micro-communities, to help flag questionable content.

“Putting the power in the hands of moderators creates a ‘by us, for us’ mentality,” Decker said. “This model for tracking violations is problematic because these administrators have zero incentive to moderate in favor of a cleaner internet – rather, they enforce biases.”

Facebook pushed its group and “community” features in a series of advertising campaigns in early 2019. But the rise of groups also allowed insulated communities to spread fake news and hate speech to millions of users. The company has faced repeated calls to curb the rise of anti-vaccination groups, where users can easily post misinformation about public health without repercussions.

Facebook has changed its rules around groups several times in recent years, and the new policies may represent a step in the right direction, said Sharon Kann, head of the Abortion Rights and Reproductive Health research team at Media Matters, though it remains to be seen whether they will make a difference.

“We’re hopeful that this change means Facebook is taking seriously the spread of misinformation and harassment on the platform – something we know has continued in spite of other policy changes,” she said. “Once again, the question remains whether Facebook is committed to actively enforcing these policies when content or activity is in clear violation.”
