Meta, Facebook’s new parent company, announced this week that it is changing some of the advertising practices on its social media platform. The company said that starting in January, it will remove detailed targeting options for “sensitive” topics, which could be related to “causes, organizations, or public figures that relate to health, race or ethnicity, political affiliation, religion, or sexual orientation.”
In this case, the health topic could refer to diabetes or lung cancer awareness; sexual orientation could refer to same-sex marriage and LGBT culture; and religion could refer to practices and groups such as the Catholic Church or Jewish holidays.
According to the Facebook for Business help center, detailed targeting allows organizations to “refine” the audience of users that Facebook shows their ads to based on demographics, interests, and behaviors.
This follows another recent privacy-related change: last week, Meta announced that it was deleting Facebook’s massive archive of user face prints, which had been used to suggest user tags in photos.
[Related: Facebook archived more than a billion user faces. Now it’s deleting them.]
Regarding this latest update, Graham Mudd, Meta’s VP of product marketing for ads, said in a news release on Tuesday: “It is important to note that the interest targeting options we are removing are not based on people’s physical characteristics or personal attributes, but instead on things like people’s interactions with content on our platform.”
Further, he claimed that personalized advertising experiences “enable people to discover products and services from small businesses that may not have the ability to market them on broadcast television or other forms of media.”
Mudd warned that even with the revised ad targeting, users may still see ad content they aren’t interested in, so the team is concurrently expanding user control options that will allow people to choose to see fewer ads about certain types of content. As of now, users can choose to see fewer ads related to politics, parenting, alcohol, and pets. Next year, Meta will expand those categories to include content such as gambling and weight loss.
This decision, Mudd wrote, was made in response to concerns expressed by experts who worry that this type of ad targeting may harm people in underrepresented groups.
[Related: Congress is coming for big tech—here’s how and why]
Sandra Wachter, an associate professor at the University of Oxford’s Oxford Internet Institute, wrote in a 2019 paper published in the Berkeley Technology Law Journal that online platform providers that use “behavioural” advertising “can infer very sensitive information about individuals to target or exclude certain groups from products and services, or to offer different prices.” This is a concept she calls “discrimination by association.” For example, in the paper, she noted that “dog owners,” which may seem like an innocuous group category, can be used as a proxy trait for lenders to decide who might qualify for a loan application.
After a series of ProPublica investigations beginning in 2016 found that certain companies used targeted advertising to exclude users by race and other categories, Facebook said in 2019, as part of a settlement with multiple civil rights organizations, that it would no longer allow employers, lenders, or landlords to target ads in ways that discriminate against protected groups.
The New York Times reported that “Meta relies on targeted advertising for the bulk of its $86 billion in annual revenue.” Mudd acknowledged that these new changes could impact small businesses, nonprofits, and advocacy groups hosted on Meta’s platforms, but suggested alternative advertising options, such as broad targeting by gender and age, and using Engagement Custom Audiences to reach people who have liked their Page or engaged with their content in the feed.