Facebook-parent Meta to end targeted ads on ‘sensitive’ categories

Meta, the parent company of Facebook and Instagram, announced Tuesday that it will no longer let advertisers target users based on their interest in certain “sensitive” categories — including health, race and political affiliation.

Meta said the move, which takes effect Jan. 19, followed “feedback from civil rights experts, policymakers and other stakeholders” and is meant to prevent advertisers from “abusing the targeting options we make available.”

“We’ve heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups,” Graham Mudd, a vice president of product marketing for Meta, admitted in a blog post announcing the decision.

Other sensitive categories that advertisers will be barred from targeting include ethnicity, religion and sexual orientation, Meta said.

Facebook CEO Mark Zuckerberg speaks to an avatar of himself in the “Metaverse” during a live-streamed virtual and augmented reality conference on October 28, 2021.
Facebook/Handout via REUTERS

The ability for advertisers to target messaging to subgroups of users is central to Meta’s sprawling ads-based business.

The feature has also drawn fire from critics who say it enables discrimination and abuse, and sows division by splitting users into separate digital worlds, sometimes on the basis of race, political affiliation and other characteristics.

In 2019, for example, the Department of Housing and Urban Development sued Facebook for letting landlords and home sellers restrict who could see ads for properties on the basis of race, religion and national origin.

Mark Zuckerberg announced the company’s name change on October 28, 2021, after several whistleblowers revealed Facebook was censoring data without approval.
EPA/META HANDOUT

In 2017, ProPublica reported that Facebook’s algorithms created ad categories for people on the platform who appeared to be interested in topics like “Jew hater” and “how to burn Jews.”

More recently, advertisers on Facebook were able to target ads for products like body armor and gun holsters at far-right groups ahead of the Jan. 6 storming of the US Capitol, according to research by the Tech Transparency Project.

In response to the backlash, the company has changed its ad-targeting tools over time, removing various classifications, but Tuesday’s announcement is its most sweeping change yet.

Facebook received significant backlash over its targeted advertising based on health, race, political affiliation and income.
Yichuan Cao/Sipa USA

The move will likely dent the company’s ad-based revenue; the news sent the stock down almost 1 percent in premarket trading Wednesday morning. It will also hamper the ability of many businesses and nonprofits to connect with niche audiences on Meta’s platforms.

“The decision to remove these Detailed Targeting options was not easy and we know this change may negatively impact some businesses and organizations,” Mudd wrote in the blog post.

“Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them.”

However, the company will still provide other targeting tools, including location targeting, to help organizations reach their audiences on its platforms.

“We continue to believe strongly in personalized advertising, and frankly personalized experiences overall are core to who we are and what we do,” Mudd said.

The announcement comes as Meta confronts one public-relations crisis after another, with whistleblower Frances Haugen gradually releasing thousands of internal documents, including company research on everything from the apps’ impact on kids to their effects on users with eating disorders.

source: nypost.com