Facebook Will Tell You If You Liked or Followed Content Made by Russian Trolls During the 2016 Election

Weekly Article
Nov. 30, 2017

As Congress and the court of public opinion have dragged America’s largest internet companies through the wringer over their role in the Russian effort to manipulate voters during and after the 2016 presidential campaign, it’s been tempting—and largely fruitless—for individuals to wonder how they may have been affected. Were those divisive Facebook posts shared by your Uncle Fred cooked up in a St. Petersburg troll farm? You may now find out.

On Wednesday—just before Thanksgiving—the company unveiled plans to inform users if they interacted with Russian propaganda from the Kremlin-backed Internet Research Agency, a Russian troll group working to inflame political unrest online by creating fraudulent accounts and content focused on socially divisive topics. The new transparency portal, which Facebook says should go live by the end of the year, will allow Facebook users to find out whether they liked or followed any of the Internet Research Agency’s posts or pages between January 2015 and August 2017.

This new transparency move is likely a response to pressure from Congress. Sen. Richard Blumenthal, a Democrat from Connecticut, sent letters to Twitter, Facebook, and Google asking the companies to take action to notify users about exactly how their time on the platforms was undermined by the Russian troll campaign. Blumenthal gave Facebook two weeks to respond; that deadline was Wednesday.

In September, Facebook shared that operatives from the Internet Research Agency spent $100,000 on thousands of ads that reached about 10 million users, often posing as American advocacy groups sharing memes and images meticulously tailored to rile certain corners of American political life. There was the Blacktivist account, for example, with which Russian trolls posted an image of Black Panthers captioned with “never forget that the Black Panthers, group formed to protect black people from the KKK, was dismantled by us govt but the KKK exists today.” That one post was shared 29,000 times. Another Kremlin-backed Facebook group, Donald Trump America, called for the “disqualification and removal of Hilary Clinton from the presidential ballot” in order to avoid a Clinton dynasty. Facebook said it shared all the ads from the Internet Research Agency with Congress.

The next month, in advance of a trio of congressional hearings on Oct. 31 and Nov. 1 about how Russia exploited Facebook, Twitter, and Google to attempt to meddle with the 2016 election, Facebook shared that the Kremlin-backed content was actually seen by 126 million users, not the 10 million it had previously reported. At those hearings, unhappy members of the House and Senate grilled executives from the three companies to learn whether they were unaware of the extent of Russian government interference on their platforms or just didn’t do much to stop it. Those hearings also led some members of Congress to suggest that companies like Facebook and Google might have grown too big to clean up their own messes—hinting at the need for more regulation of how ads are sold. Google, Facebook, and Twitter all appear open to more explicit rules on this front, and Facebook and Twitter both unveiled new policies in October, in advance of the hearings, about how political ads will be displayed.

Facebook’s latest transparency effort about Russian propaganda does have limits. For one, according to Bloomberg, Facebook won’t show you posts or pages that may have ended up in your news feed because a friend liked them. Facebook also isn’t showing the content that users may have seen, only the names of the pages that made the content, potentially leaving it to Congress to decide whether to share the posts it has in its possession. Finally, the new tool only covers Russian trolls affiliated with the Internet Research Agency. If you were fooled by any other Russian government-backed trolls, you’re probably on your own.

This article originally appeared in Future Tense, a partnership of Slate, New America, and Arizona State University.