Facebook has removed 16,000 groups trading fake reviews on its platform after another intervention by the UK’s Competition and Markets Authority (CMA), the regulator said today.
The CMA has been leaning on tech giants to prevent their platforms from being used as thriving marketplaces for selling fake reviews since it began investigating the issue in 2018 — pressuring both eBay and Facebook to act against counterfeit review sellers back in 2019.
The two companies pledged to do more to tackle the insidious trade last year after coming under further pressure from the regulator — which found that Facebook-owned Instagram was also a thriving hub of fake review trades.
The latest intervention by the CMA looks considerably more substantial than last year’s action — when Facebook removed a mere 188 groups and disabled 24 user accounts. Although it’s not clear how many accounts the tech giant has banned and/or suspended this time, it has removed orders of magnitude more groups. (We’ve asked.)
Update: A spokeswoman for the CMA said the question of how many accounts have been banned/suspended in this wave of group takedowns is one for Facebook to answer, adding that the regulator has focused on the removal of groups trading misleading/fake reviews rather than individual accounts — “as this is the most effective way of preventing the trade of such content”.

“This is because banned or suspended users could create new profiles, whereas removing the group in which they are trading is more effective in disrupting and deterring this activity,” she added.

Facebook was also contacted with questions, but it did not answer what we asked directly, sending us this statement instead:
“We have engaged extensively with the CMA to address this issue. Fraudulent and deceptive activity is not allowed on our platforms, including offering or trading fake reviews. Our safety and security teams are continually working to help prevent these practices.”
Since the CMA has been raising the issue of fake review trading, Facebook has been repeatedly criticized for not doing enough to clean up its platforms, plural. Today the regulator said the social media giant has made further changes to the systems it uses for “identifying, removing and preventing the trading of fake and/or misleading reviews on its platforms to ensure it is fulfilling its previous commitments”.
It’s not clear why it’s taken Facebook well over a year — and several high-profile interventions — to dial up action against the trade in fake reviews. But the company suggested that the resources it has available to tackle the problem had been strained due to the COVID-19 pandemic and associated impacts, such as home working. (Facebook’s full-year revenue increased in 2020, but so too did its expenses.) According to the CMA, changes Facebook has made to its system for combating traders of fake reviews include:
- suspending or banning users who are repeatedly creating Facebook groups and Instagram profiles that promote, encourage, or facilitate fake and misleading reviews
- introducing new automated processes that will improve the detection and removal of this content
- making it harder for people to use Facebook’s search tools to find fake and misleading review groups and profiles on Facebook and Instagram
- putting in place dedicated processes to make sure that these changes continue to work effectively and stop the problems from reappearing
Again it’s not apparent why Facebook would not have already been suspending or banning repeat offenders — at least, not if it was actually taking good faith action to genuinely quash the problem, rather than seeing if it could get away with doing the bare minimum.
Commenting in a statement, Andrea Coscelli, chief executive of the CMA, essentially makes that point, saying: “Facebook must do all it can to stop the trading of such content on its platforms. After we intervened again, the company made significant changes — but it is disappointing it has taken them over a year to fix these issues.”
“We will continue to keep a close eye on Facebook, including its Instagram business. Should we find it is failing to honor its commitments, we will not hesitate to take further action,” Coscelli added.
A quick search on Facebook’s platform for UK groups trading in fake reviews appears to return fewer obviously dubious results than when we checked in on this problem in 2019 and 2020. However, the results included several private groups, so it was not immediately possible to verify what content was being solicited from members.
We also found some Facebook groups offering Amazon reviews intended for other European markets, such as France and Spain (in one public group aimed at Amazon Spain, we found someone offering a “fee” via PayPal for a review; see screengrab below) — suggesting Facebook isn’t applying the same level of attention to tackling fake reviews traded by users in markets where it’s faced fewer regulatory pokes than it has in the UK.