Facebook to tighten ad targeting after antisemitic ‘fail’, says Sheryl Sandberg

Facebook is tightening controls on its advertising targeting tools, chief operating officer Sheryl Sandberg announced in a statement acknowledging that allowing advertisers to target “Jew haters” – an option available until last week – was “totally inappropriate and a fail on our part”.

The policy change follows an embarrassing report by ProPublica on Thursday that the company’s ad-buying system allowed advertisers to target users interested in antisemitic subjects. Subsequent reporting found additional bigoted terms in Facebook’s system that could be used to target advertisements.

“The fact that hateful terms were even offered as options was totally inappropriate and a fail on our part,” Sandberg wrote in a Facebook post. “We never intended or anticipated this functionality being used this way – and that is on us. And we did not find it ourselves – and that is also on us.”

The announcement and mea culpa come amid increased scrutiny of Facebook’s advertising tools.

On 6 September, Facebook acknowledged that an influence operation probably based in Russia purchased $100,000 worth of ads on its platform to promote divisive social and political messages. Mark Zuckerberg, Facebook’s CEO, had previously said that the idea that “fake news” on the platform had influenced the outcome of the 2016 presidential election was “a pretty crazy idea”, though the company’s own sales team touts its ability to “significantly shift voter intent” through Facebook ads.

On Wednesday, 20 Democratic senators and representatives wrote to the Federal Election Commission to urge it to “develop new guidance” for digital advertising platforms “to prevent illicit foreign spending in US elections”.

Sandberg’s statement did not address lawmakers’ concerns about Facebook’s role in swaying elections, but instead focused on the targeting problem revealed by ProPublica.

The bigoted categories appeared as ad-targeting options because the company used an algorithm to generate them from what users wrote in their profiles about their education and work history. About 2,300 Facebook users wrote on their profiles that they had studied “Jew hater” in college, for example, automatically turning that phrase into an advertising target.
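In outline, the process Sandberg described is a pipeline that aggregates free-text profile entries directly into selectable ad categories, with no human check on their content. The following is a minimal, purely illustrative sketch of that kind of pipeline – the field names and popularity threshold are assumptions for the example, not Facebook’s actual code:

```python
# Illustrative sketch only (not Facebook's actual system): ad-targeting
# "interests" are derived directly from free-text profile fields, so any
# phrase enough users type in becomes a selectable category.
from collections import Counter

MIN_USERS = 100  # hypothetical threshold for surfacing a category


def build_targeting_categories(profiles):
    """Aggregate self-reported education/work fields into ad-targeting options."""
    counts = Counter()
    for profile in profiles:
        for field in (profile.get("field_of_study"), profile.get("employer")):
            if field:
                counts[field.strip().lower()] += 1
    # Every sufficiently common phrase becomes a category, with no human
    # review of its content - the gap the article describes.
    return {phrase: n for phrase, n in counts.items() if n >= MIN_USERS}


profiles = [{"field_of_study": "Economics"}] * 120 + [{"employer": "Acme Corp"}] * 30
print(build_targeting_categories(profiles))  # {'economics': 120}
```

The change Sandberg announced amounts to adding a human review step before any such automatically generated category is offered to advertisers.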

Facebook disabled those targeting fields following the ProPublica report, and is now reinstating only those options that have been reviewed by a human, Sandberg said. The company will add “more human review and oversight to our automated processes”, she said, though Facebook did not immediately respond to queries about the specifics of its plans.

Additionally, Sandberg said that the company was creating a system for users to report abuses of the ad system and “clarifying [its] advertising policies and tightening [its] enforcement processes” to ensure that ad targeting was not used in a way that violated Facebook’s “community standards”.

Facebook previously updated its advertising policies in February to address criticism that allowing advertisers to exclude users based on their “ethnic affinity” enabled them to break US anti-discrimination laws.

Privacy advocates have long raised concerns about the degree of targeting made possible by Facebook’s vast trove of personal data. In May, The Australian reported that the company had told advertisers that it could identify when teenagers were feeling “insecure” and “worthless”.

