Facebook has, once again, found itself at the center of controversy—this time for accusations of gender and age discrimination in its job advertising algorithm.
Nonprofit campaign group Global Witness released a report earlier today that recaps an investigation it ran on Facebook, concluding that Facebook’s system “appears to operate in a discriminatory manner.”
The U.K.-based group came to this conclusion after it created two job ads with the intention of using different forms of discriminatory targeting. One ad was targeted to exclude women, and the other to exclude people over the age of 55.
Although Facebook required Global Witness to tick a box confirming it would comply with the platform’s non-discrimination policy when placing the ads, the social media giant then approved both ads for publication, in violation of that very policy. Global Witness withdrew the ads before their scheduled publication date.
To probe further and test whether Facebook’s algorithms showed signs of automated bias, Global Witness ran a separate experiment with four additional ads. The ads linked to four real job openings: for mechanics, preschool nurses, pilots, and psychologists. Global Witness said it used the “Traffic/Link Clicks” objective—which, according to Facebook, delivers the ads to “the people who are most likely to click on them.”
Because Global Witness specified no targeting criteria, Facebook’s algorithm had complete control over who was shown the ads. The results were as follows:
- 96% of the people shown the ad for mechanic jobs were men.
- 95% of those shown the ad for preschool nurse jobs were women.
- 75% of those shown the ad for pilot jobs were men.
- 77% of those shown the ad for psychologist jobs were women.
According to Global Witness, “Facebook’s business model of profiting from profiling appears to replicate the biases we see in the world, potentially narrowing opportunities for users and preventing progress and equity in the workplace.” Facebook pushed back on the allegations, stating that its system “takes into account different kinds of information to try and serve people ads they will be most interested in and we are reviewing the findings within this report.”
Global Witness has filed a submission to the U.K. Equality and Human Rights Commission and has written to the Information Commissioner, urging both bodies to compel Facebook to change its practices.
This raises a broader question for the online era: should the civil rights laws that govern other businesses apply to internet platforms as well? For example, a restaurant cannot give men and women different menus—so why can a website like Facebook show men and women different job opportunities?
Before social media, jobs for mechanics may have been advertised in magazines aimed at men. As head of Global Witness’ Digital Threats campaign Naomi Hirst said, “The difference here is that if you are a woman looking for a job as a mechanic, you could just as easily go to a shop and buy that magazine as your male peer. It’s just simply not true online.”
This is certainly not the first time Facebook and its algorithm have come under fire for seemingly discriminatory ad targeting. In April, researchers at the University of Southern California shared similar findings about gender discrimination on the platform. Interestingly, when they ran near-identical tests of LinkedIn’s delivery of job ads, they found no evidence of gender skewing.
In June, Samantha Liapes alleged Facebook was not showing her insurance ads because the platform “pushes insurance ads away from women and older users.” Additionally, back in 2019, the U.S. Department of Housing and Urban Development (HUD) filed charges against Facebook, alleging that its ad targeting system enabled housing discrimination.
In 2016, Facebook promised to refine “ethnic affinity” targeting policies to avoid ethnic discrimination. In 2018, Facebook once again vowed to end discriminatory advertising across the U.S. within 90 days.
It’s clear that Facebook’s job ads technology allows for discrimination, whether direct or indirect, even as the network continually promises to do better. According to Aaron Rieke, managing director at Upturn, a nonprofit that promotes civil rights in technology, Facebook does have options for removing bias from its algorithms. One example is removing the ads from people’s News Feeds and instead promoting them as Craigslist-style listings that anyone can browse on equal footing. Whether the platform will be held accountable and pushed toward those options remains to be seen.
“In these areas where the advertisements are for important life opportunities, and we’re trying to change really biased demographic patterns in society, their algorithms shouldn’t be tuned just to go along with that,” he says. We live in a time when algorithms determine whether or not someone sees any given advertisement—but when those algorithms reflect the same human biases and illegal practices that caused inequality in the first place, how can we expect to move toward parity?