Facebook on Wednesday beefed up automated tools to help group moderators keep exchanges civil at a time of sharply conflicting views.
The leading social network, hammered by critics over vitriolic content in news feeds, has presented groups as enclaves where people with differing opinions on hot-button issues can bond over shared interests, from music and hobbies to parenting.
“Some of these groups are millions of people,” Tom Alison, Facebook vice president of engineering, said in an AFP briefing on the new tools for moderators.
More than 1.8 billion people use groups every month, and more than 70 million administrators work to “keep conversations healthy” in forums, according to Alison.
Although much of the content in groups is not public, Facebook has sought ways to curb hateful and abusive content in these forums.
Facebook’s automated systems already check for posts in groups that violate the social network’s rules for acceptable content.
A new “Admin Assist” feature allows moderators to set criteria for what is considered acceptable within the group, then have posts or comments automatically checked for violations.
Moderators can also use software to remove comments that contain links to unwanted content, slow down heated conversations so tempers can cool, or require people to have been members of the social network or of a group for a period of time before they can participate in conversations.
“What these tools do is automate things that admins have done manually, but don’t expose anything they didn’t have access to before,” Alison said.
Facebook is testing artificial intelligence that can detect signs of contentious conversations, such as rapid-fire replies or controversial content, and then alert moderators.
“The AI is looking at things associated with threads that have conflicts,” Alison said.
“Some administrators welcome people having debates; others don’t want controversial conversations.”
Dads With Daughters, a group run by the nonprofit Fathering Together, is among those with early access to test the tools, according to moderator Brian Anderson.
The online community for fathers sharing tips, resources and pride in raising their daughters has over 127,000 members.
“At first there was really nothing; you just put the rules there and what a ban is,” Anderson said of moderation at the start.
The tools added by Facebook have reduced the number of moderators needed for the group, according to Anderson.
Moderators’ firm stance against “toxic male tropes,” such as jokes about brandishing shotguns to protect daughters from suitors, has helped set the tone in the community, according to Anderson.
“You can tell the groups that are really trying to keep the space civil from posts that get nasty right away,” he said.