
Facebook will no longer recommend health groups

Groups have been a major source of misinformation on the platform.


September 17, 2020


This story originally appeared on Engadget

Facebook says groups devoted to health-related topics are no longer eligible to appear in recommendations. The update is part of the company's recent efforts to combat misinformation.

"In order to prioritize the connection of people with accurate health information, we no longer show health groups in recommendations," Facebook wrote in a statement. Facebook notes that users can still search for these groups and invite others to join them, but these groups will no longer appear in suggestions.

Facebook groups, especially those dealing with health-related topics, have long been problematic for the company. Groups built around anti-vaccine conspiracy theories, for example, have also been linked to QAnon and Covid-19 disinformation, often surfaced by the platform's own algorithmic suggestions. Mark Zuckerberg recently said the company will not remove anti-vaccine content the way it does Covid-19 misinformation.

Regarding QAnon, Facebook says it is taking an additional step to curb the spread of groups associated with the conspiracy theory by "reducing their content in the news feed." The company previously removed hundreds of groups linked to the movement, but hasn't completely eradicated its presence.



Finally, Facebook is now archiving groups that no longer have an active administrator. "In the coming weeks we will begin to archive groups that have not had an administrator for some time," Facebook writes. Going forward, the company will suggest admin roles to members of groups without one before archiving them.

Facebook notes that it is penalizing groups that repeatedly share false claims debunked by its fact-checkers, and that in the last year it has removed more than a million groups for repeat or other violations of its rules.


However, critics have long said that Facebook is not doing enough to police groups on its platform that have been linked to disinformation, harassment, and threats of violence. The company came under fire last month for not removing a Wisconsin militia group that organized an armed response to protests in Kenosha until the day after a deadly shooting. A number of Facebook groups have also been blamed for hampering the emergency response to devastating Oregon wildfires by spreading unsubstantiated conspiracy theories about how the fires started. Facebook eventually began removing these claims after rescue workers asked people to stop sharing the rumors.
