Since the start of the coronavirus pandemic, Facebook has taken a much stricter stance on health misinformation than it had in the past, removing millions of posts for sharing misinformation. Now, we know just how many accounts, groups, and pages have been removed from the platform for repeatedly breaking those rules: 3,000.
Facebook shared the stat as part of its community standards enforcement report, which measures how the company enforces its rules. The number may seem low given the vast amount of misinformation on Facebook about the pandemic. The company also said that more than 20 million pieces of content have been removed, and more than 190 million have received warning labels, between the start of the pandemic in 2020 and this past June.
But the relatively low number of bans — just 3,000 — tracks with findings by researchers who say that just a small number of accounts are responsible for the vast majority of vaccine mistruths on social media.
During a call with reporters, Facebook’s VP of Content Policy, Monika Bickert, said the company has had to continually evolve its policies. It now removes 65 types of vaccine falsehoods, such as posts saying COVID-19 shots cause magnetism. She also noted that some groups have used “coded language” to evade the company’s detection, which can pose a challenge.
Facebook’s handling of vaccine misinformation has been in the spotlight in recent months as government officials, including the president, have said Facebook should do more to counter mistruths about the COVID-19 vaccines. For its part, Facebook says that vaccine hesitancy has declined by 50 percent in the US, according to its surveys, and that its COVID-19 Information Center has reached 2 billion people.