On October 20, 2018, just 16 days before that year’s midterm elections, Jim Allen, the spokesperson for the Chicago Board of Elections Commissioners, tried to boost a Facebook post on the board’s page announcing the opening of early-voting locations in the country’s sixth-largest elections jurisdiction.
Boosting a post effectively turns it into an advertisement, and he soon got notice that it would not be approved. He protested Facebook’s decision through an online support portal. The following morning he was told the company had determined the attempted post would be an ad run by a page that wasn’t authorized to run ads “related to politics,” according to emails reviewed by Mother Jones. Allen was informed that in order to comply with Facebook’s policy on political advertising, his page would first have to register to pay for political ads, which would automatically attach a campaign-style “paid for” identifier to content the board wanted to boost.
“For reasons that make absolutely no sense, Facebook is blocking the governmental agency in Chicago that is responsible for conducting elections from announcing the start of Early Voting, wrongly calling this political advertising,” Allen wrote back later that day. The following morning Allen was notified that Facebook had changed its mind and would in fact allow the boosted post, determining it was not a political advertisement under the company’s definition. The delay cost the Board about two days—and set Allen fuming.
Allen would go on to tangle with Facebook twice more at much greater length the following spring, including another debate over whether the company’s policy on political advertisements should apply to a governmental entity like the Chicago board, and another situation when Allen believed voters could be exposed to misinformation about Chicago elections by an imposter Facebook account. His quarrels with Facebook offer a window into the company’s struggles to make thousands of sensitive decisions about what is and isn’t political content, and raise questions about the speed of the company’s response to urgent problems flagged by innumerable elections administrators across the world. Allen warns that the company’s delays in reacting to such complaints could create opportunities for bad actors to disseminate false or misleading claims that could keep people from the polls, even in the face of warnings from multiple federal officials that Russia’s 2016 operation, which harnessed the company’s targeted advertising platform to reach 126 million users, was merely a sign of things to come.
“You could have a situation in a swing state where they target a few specific jurisdictions, put out a bunch of false information about vote-by-mail ballots, early voting, or Election Day,” Allen told Mother Jones. “The time has come where [Facebook] needs to have an election team in place, and it can’t just be algorithms.”
Facebook defends its decision to apply political advertising policies to posts like Allen’s. “We aren’t trying to create needless obstacles for advertisers, but transparency is—and will continue to be—a priority,” a spokesperson told Mother Jones, noting a major component of the company’s paid political content policy—that all such posts are archived by the company in a publicly accessible ad library for seven years.
In fact, the political advertising policies that frustrated Allen have their origins in the public pressure that arose following the 2016 elections. On September 21, 2017, amid growing questions from Congress about how its platform enabled Russia’s operations, Facebook CEO Mark Zuckerberg announced the company would begin requiring those boosting or posting political ads to submit to an authorization process that included verification of identity.
“We know we were slow to pick up foreign interference in the 2016 US elections,” the company admitted in April 2018 as it expanded the policy to include anyone running “issue” ads. “Today’s updates are designed to prevent future abuse in elections—and to help ensure you have the information that you need to assess political and issue ads.”
The company has faced several bumps in rolling out its political advertising policies, earning complaints from candidates and others that the rules are unevenly or nonsensically applied. As the Tampa Bay Times reported last summer, in the days after the policy’s debut Facebook not only erroneously removed three ads posted by former Gov. Rick Scott but also blocked several “share the road” public service messages purchased by the Florida Department of Highway Safety and Motor Vehicles. “We’re working with our vendor to figure out why and the method to their madness,” a department spokesperson said of Facebook. “Obviously, we’re not a political entity.”
One of Allen’s main objections to how the company enforces its policy is that he sees the Chicago board as a nonpolitical entity, and he worries that forcing it to run “political” ads could create voter confusion and push its staff to skirt boundaries on political activity outlined in the board’s ethics policy.
Alex Stamos, the former chief security officer at Facebook, led the company’s investigation into Russia’s 2016 influence operation on the platform. According to Stamos, who now works at Stanford University, Facebook’s policies on political advertising were developed to catch similar efforts. While he wouldn’t comment directly on the Chicago situation, he defended the decisions made by Facebook and other tech juggernauts: “People have to be realistic about the kinds of things Facebook can do at scale.”
As an August 2018 Vice Motherboard piece noted, Facebook’s “two billion users make billions of posts per day in more than a hundred languages,” and its human moderators “are asked to review more than 10 million potentially rule-breaking posts per week.” As Stamos put it, “Facebook is making more decisions per hour than the entire federal court system makes in a year.”
Facebook’s political advertising requirements, and their goal of preventing election interference, were again cited to Allen by a company representative in March 2019 after another posting by the Chicago Board of Elections Commissioners promoting early voting was blocked, this time for municipal elections. Allen said he was frustrated with the company’s approach by this point, as reflected in one email he sent at the time. “Facebook has a history of failing to identify and stop adversarial Russian trolls, but now it will block a Chicago election authority from placing an ad about when and where people may use early voting,” Allen wrote. “This is objectionable in every possible way,” he added, telling the company its policy was “overkill” and “Orwellian.”
“Facebook’s definition of political ads is (as you noted) much broader than most legal definitions—and is designed to prevent foreign elections interference in areas and categories where we have seen it in the past,” responded Eva Guidarini, a member of Facebook’s US Politics and Government Outreach team. “All Elections Commissions/departments/boards across the country face the same rules—and we aren’t able to make any exceptions.”
Allen responded, explaining that Facebook’s rules would require the Chicago Board of Election Commissioners to “knuckle under and check a box to buy a political ad, even when the ad is not,” and that it was against the law for the government body to buy political ads. “You’re forcing us to check a box that suggests that we’re violating the law, even though we’re clearly not.”
Guidarini responded that she understood that the post Allen was trying to boost didn’t “fit into a legal definition of political ads, which is why I emphasize that our definition is not a legal one,” and told him that the post would not be marked as “political” in Facebook users’ newsfeeds; it would simply include a “paid for by” identifier.
Rahul Patel, a cybersecurity staffer working with the board, also weighed in on the email thread.
“You have to recognize that the whole process is very badly managed at Facebook,” he wrote. “Even if it is not (a) legal definition, we can’t go convincing everyone who asks us (e.g. Auditors, poll watchers). You would agree that we work for Voters and not Facebook to correct their mistakes.”
Patel asked to escalate the issue, but Guidarini told him that she and another company representative already on the email thread were their best points of contact, saying she would relay “feedback” to “our product teams—but for now, these are the rules.” Allen ended up running the boosted post with the “paid for” language, after acceding to Facebook’s requirements and registering as a political advertiser.
When asked about the company’s interactions with Allen, a Facebook spokesperson said its October 2018 decision to allow the Board’s boosted post to run as a nonpolitical ad was, in fact, a mistake, and that it should have included the “paid for by” language. Its subsequent March 2019 decision to force Allen to classify a boosted post on the board’s page as a “political” ad was the correct one, the spokesperson said, because the posts included language that was “about any election, referendum or ballot initiative, including ‘get out the vote’ or election information campaigns.” The policy is an attempt to require that entities running political ads targeting Americans are based within the United States.
A month after Allen conceded and agreed to post the Board’s notification as a political ad, he reached back out to Guidarini and her colleague Rachel Holland with a new problem: The board had found another page on Facebook that appeared to be impersonating its official page, with the same logo and name. Later that day Holland told Allen she’d look into it. Nearly two weeks later, Allen reached out again, upset that he hadn’t heard anything further: “The impersonation issues that we identified for you 11 days ago are such blatant violations of Facebook’s Community Standards, as published on Facebook itself, that we were fully expecting a timely and straight-forward response,” Allen wrote. Holland wrote back a day later, saying the company was still reviewing the matter.
“We were thinking, ‘Wow, what if this happened at election time?’” Allen told Mother Jones, explaining that he wanted to prevent the page from having any chance to spread incorrect or confusing information that could suppress turnout. Even though he raised Facebook’s own policies about impersonation, Allen complains that the company “took its time reacting and did so rather passively.”
A day after her last response, Holland told Allen that the company had messaged the page’s administrators and told them that Facebook would suspend it in 30 days unless they made “revisions” to clarify that it wasn’t the Board’s actual page. Such steps would include removing the word “official” from the page, specifying that it was a dedicated fan, critique, or satire page, or noting in the page’s about section that it was unaffiliated.
Allen was unsatisfied. “Your list of recommendations enable impersonation,” he wrote back. “Your list of recommendations involve everything but the most apparent remedy… Changing the name of the group so that it’s not confused with the actual Board of Election Commissioners for the City of Chicago.”
Allen told Holland in a subsequent email that when the board ran into the same issue with an apparent imposter Twitter account using the board’s logo and going by @ChicagoElections, that company “resolved the matter in hours,” almost immediately shutting down the profile.
But Holland and Facebook didn’t respond until May 30, more than three weeks later, when she finally let Allen know that the page administrators she had contacted had failed to make any changes within the 30-day window. The page had been deleted. Allen thanked her for the help.
Asked about the incident by Mother Jones, a Facebook spokesperson said the company never found any evidence of policy violations by the group mimicking the Chicago board, but decided to remove the group when its administrators didn’t respond to Facebook’s requests for revisions.
The spokesperson also pointed to a dedicated channel allowing state elections officials to directly contact Facebook to report issues involving voter suppression or related concerns. The channel isn’t available to county or municipal officials like Allen.
“They have this awesome power, and arguably they owe something back to society,” Allen told Mother Jones. “Our hope, not so much for our own sake but for American elections in general, is that Facebook devotes the resources—and by that we mean human talent—to be able to make decisions quickly and to know what’s political speech and to be able to recognize it.”