Facebook announced this week that it will remove more than 5,000 targeting options from its custom audience ad tools.
Advertisers on Facebook can use these targeting tools to customize (and personalize) who sees their ads; for example, women in a certain age group, in a certain region, who are interested in a certain category of products.
In the coming weeks, Facebook said in a blog post, many of those targeting options will no longer be available, such as those related to ethnicity, religion, and national origin.
The announcement comes after the U.S. Department of Housing and Urban Development filed a complaint against Facebook for allowing discrimination in targeted ads related to housing.
These targeting tools, the complaint says, allow advertisers to violate the Fair Housing Act, which prohibits landlords and sellers from discriminating against potential renters and homebuyers.
In 2016, ProPublica also found that Facebook's targeting tools allowed advertisers to exclude users with certain “Ethnic Affinities” from seeing their ads.
What This Means for Advertisers
Facebook says that it already requires advertisers within the categories of housing, employment, and credit to complete a compliance certification on its non-discrimination policy.
With these changes, all U.S. advertisers will be required to complete that certification, which they'll find in the Ads Manager tool. Until they do, they won't be able to continue advertising on Facebook.
Eventually, that certification requirement will be rolled out to advertisers in other countries, as well as to those using other Facebook tools, such as its APIs.
One of the factors that makes Facebook so valuable to advertisers is the amount of data it owns on its users, from their lifestyle preferences to their political beliefs. (Here's what happened when I downloaded my Facebook data file.)
And while that data isn't personally identifiable to advertisers -- unless it's misused, as it was in the Cambridge Analytica scandal earlier this year -- it allows for an arguably unrivaled level of ad personalization.
For many, though, the removal of these targeting options is likely to be a welcome change.
"These changes seem to be aimed at limiting particularly unethical targeting," says HubSpot VP of Marketing Jon Dick, "which will have nothing but positive implications on the user experience."
But How Effective Will It Be?
We started to think about some of the broader problems that could potentially be solved by the removal of targeting options. One of them was the presence of echo chambers on social media: a term describing what happens when users are exposed only to content and conversations that reflect their current points of view or belief systems.
We asked 708 internet users across the U.S., UK, and Canada: If Facebook makes paid ads less hyper-targeted to specific demographics or groups of people, do you think it could help expose users to different points of view?
For the most part, people believe that the removal of these targeting options could solve some problems -- but not all of them.
Data collected with Lucid
The most optimism came from the U.S., where 37% of respondents indicated that they believe these changes will make significant improvements.
But when it comes to prioritizing the issues Facebook says it's looking to address, discriminatory ads seem to fall to the bottom of the list, with only 4% of respondents on average citing them as the problem to fix first.
Instead, it seems, data privacy is a top priority across all regions surveyed.
Still, says Dick, addressing discriminatory ads is relevant to the broader set of issues Facebook faces.
"Advertising on Facebook has been the wild west for a decade. Never before have marketers had so much access to such personal and granular data," he explains. "And just like the wild west, there are 'good guys' and 'bad guys' in its advertising ecosystem."