Facebook spokesman Joe Osborne said in a statement that at the time of Allen’s report the company “had already researched these topics.” “Since then, we have been building teams, developing new policies and working with industry colleagues to address these networks. We have taken aggressive measures to combat such foreign and domestic inauthentic groups and shared the results publicly on a quarterly basis.”
In a fact-checking process shortly before publication, MIT Technology Review found that five of the troll-farm pages listed in the report were still active.
The report found that troll farms reach the same demographic groups that the Kremlin-backed Internet Research Agency (IRA) targeted during the 2016 election: Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one member of the Russian IRA, accused of interfering in the 2016 U.S. election, also visited Macedonia around the time its first troll farms appeared, though it found no concrete evidence of a connection. (Facebook said its own investigation also failed to establish a link between the IRA and the Macedonian troll farms.)
“This is not normal. This is not healthy,” Allen wrote. “We have allowed inauthentic actors to accumulate huge followings for largely unknown purposes … The fact that actors with possible links to the IRA have access to huge audiences in the same demographic groups targeted by the IRA poses an enormous risk to the 2020 US election.”
And as long as troll farms succeed with this tactic, any other bad actor could too: “If troll farms are reaching 30 million U.S. users with content targeted to African Americans, we should not be surprised to find that the IRA also currently has a large audience there.”
Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part out of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who provided the report. Allen declined to comment.
The report exposes the alarming state of affairs that Facebook’s leadership has allowed to persist for years, despite repeated public promises to aggressively fight foreign election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.
Its findings include:
- As of October 2019, some 15,000 Facebook pages with a majority-US audience were being run out of Kosovo and Macedonia, which were notorious sources of bad actors during the 2016 election.
- Collectively, these troll-farm pages (which the report treats as a single page for comparison purposes) reached 140 million U.S. users monthly and 360 million users globally each week. Walmart’s page, by comparison, reaches the second-largest U.S. audience: 100 million people.
- Troll-farm pages also combined to form:
- the largest Christian American page on Facebook, 20 times larger than the next largest; it reached 75 million U.S. users monthly, 95% of whom had never followed any of the pages.
- the largest African American page on Facebook, three times larger than the next largest, reaching 30 million U.S. users monthly, 85% of whom had never followed any of the pages.
- the second-largest Native American page on Facebook, reaching 400,000 users monthly, 90% of whom had never followed any of the pages.
- the fifth-largest women’s page on Facebook, reaching 60 million U.S. users monthly, 90% of whom had never followed any of the pages.
- Troll farms primarily target the United States, but also reach the United Kingdom, Australia, India, and Central and South America.
- Facebook has conducted multiple internal studies confirming that content more likely to attract user engagement (likes, comments, and shares) is more likely to be bad. Yet the company continues to rank content in users’ news feeds by engagement.
- Facebook prohibits pages from posting content simply copied and pasted from elsewhere on the platform, but it does not enforce the policy against known bad actors. This lets foreign actors who don’t speak the local language post wholly copied content and still reach a large audience. At one point, as many as 40% of page views on U.S. pages went to pages with mostly unoriginal or minimally original content.
- Troll farms previously made their way into Facebook’s monetization programs, Instant Articles and Ad Breaks, which are designed to help news organizations and other publishers earn money from their articles and videos. At one point, thanks to a lack of basic quality checks, as many as 60% of Instant Article reads went to content plagiarized from elsewhere. This made it easy for troll farms to blend in, and even to receive payments from Facebook.
How Facebook enables troll farms and grows their audiences
The report focuses specifically on troll farms based in Kosovo and Macedonia, run by people who don’t necessarily understand American politics. Yet because of the way Facebook’s news feed reward systems are designed, they can still have a significant impact on political discourse.
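The reward dynamic the report describes can be sketched as a toy model. This is purely illustrative and is not Facebook’s actual system; the scoring function, field names, and numbers below are all hypothetical, chosen only to show how ranking purely by engagement can surface copied, low-quality content above original work:

```python
# Toy sketch (hypothetical, not Facebook's code): a feed ranked purely by
# raw engagement counts, with no signal for originality or quality.

def engagement_score(post):
    """Sum of likes, comments, and shares -- the signals the report names."""
    return post["likes"] + post["comments"] + post["shares"]

def rank_feed(posts):
    """Order posts by engagement alone, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)

# Hypothetical example data: a copied post that happens to draw more
# engagement than an original one.
posts = [
    {"id": "original-reporting", "likes": 120, "comments": 30, "shares": 10},
    {"id": "copied-troll-post", "likes": 900, "comments": 400, "shares": 700},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])
```

Under this scheme, the copied post ranks first simply because it drew more engagement, which mirrors the report’s point: a ranking objective blind to originality rewards whoever can generate clicks, regardless of where the content came from.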