(CNN) – Meta, the parent company of Facebook, is facing four new complaints from human rights groups in Europe, which allege that the algorithm Facebook uses to target users with job advertisements is discriminatory. The allegations come years after the company first promised to address the issue in other regions.
According to research conducted by Global Witness, an international nonprofit, Facebook’s ad platform frequently targets users with job postings based on historical gender stereotypes. For example, ads for mechanic positions are shown predominantly to male users, while ads for preschool teachers are shown primarily to female users. The data used in this research was obtained from Facebook’s Ad Manager platform.
Global Witness shared additional research exclusively with CNN, suggesting that this algorithmic bias is a global problem. The human rights group is concerned that Facebook is exacerbating existing biases in society, hindering progress and equity in the workplace. Naomi Hirst, who leads Global Witness’ campaign strategy on digital threats to democracy, expressed these concerns in an interview with CNN.
The complaints against Meta were jointly filed by Global Witness, Bureau Clara Wichmann, and Fondation des Femmes in France and the Netherlands. These nonprofit organizations conducted research in both countries and are urging human rights agencies and data protection authorities to investigate whether Meta’s practices violate human rights or data protection laws. If the allegations are substantiated, Meta may face fines, sanctions, or pressure to make further changes to its product. Similar discrimination concerns led to previous complaints filed by Global Witness with the UK Equality and Human Rights Commission and Information Commissioner’s Office, which are still under investigation.
The complaints in Europe mirror a complaint filed with the US Equal Employment Opportunity Commission in December by Real Women in Trucking, a women’s trucking organization. The complaint alleges that Facebook discriminates based on age and gender when determining which users see job ads. Meta declined to comment on the Real Women in Trucking complaint.
Ashley Settle, a spokesperson for Meta, stated that the company applies targeting restrictions to employment, housing, and credit ads, and it provides transparency about these ads in its Ad Library. These targeting restrictions are in place in the United States, Canada, and more than 40 European countries and territories. Settle clarified that Meta does not allow advertisers to target ads based on gender. The company continues to collaborate with stakeholders and experts to address algorithmic fairness.
Regarding the new complaints filed in Europe, Meta did not provide a specific comment.
Experiencing Gender Bias in Job Opportunities
Facebook has faced multiple allegations of discrimination in the delivery of job advertisements over the past decade. In 2019, as part of a settlement agreement to resolve various lawsuits in the United States, Facebook promised to make changes to prevent biased delivery of housing, credit, and employment ads based on protected characteristics like gender and race.
Efforts to address these disparities included the removal of the option for advertisers to target employment ads based on gender. However, the recent research by human rights groups suggests that Facebook’s own algorithm undermines these changes. Consequently, numerous users may miss out on job opportunities they are qualified for simply because of their gender. The groups worry that this could worsen historical workplace inequities and pay gaps.
Linde Bryk, head of strategic litigation at Bureau Clara Wichmann, emphasized the need for accountability. She stated that corporations cannot simply hide behind algorithms but should take responsibility for their impact on women’s rights and minority groups. Bryk highlighted the significance of understanding and controlling the consequences of products put on the market.
Global Witness conducted additional experiments in four other countries, including India, South Africa, and Ireland, which indicated that the algorithm perpetuated similar biases worldwide.
With more than two billion daily active users globally, Facebook serves as a crucial source for users to discover job openings. The platform’s business model relies on its algorithm’s targeted delivery of ads to users who are most likely to engage with them, ensuring ad buyers see a return on their investment. However, Global Witness’ research suggests that this approach results in job ads being targeted based on gender stereotypes. Human rights advocates argue that these biases in Facebook’s ad system may exacerbate existing disparities.
In France, for instance, Facebook is commonly used for job searches among individuals with lower income levels. Consequently, those most affected by the alleged algorithmic biases are individuals who are already marginalized. Caroline Leroy-Blanvillain, a lawyer and member of the legal force steering committee at Fondation des Femmes, highlighted this concern.
Pat de Brún, head of Amnesty International’s big tech accountability team, expressed his lack of surprise regarding Global Witness’ findings. He stated that research consistently demonstrates how Facebook’s algorithms produce unequal outcomes, reinforcing marginalization and discrimination. De Brún referred to this as the reproduction and amplification of society’s worst aspects.
He further explained that algorithms are often perceived as neutral, even as they perpetuate biases and make them harder to challenge. De Brún stressed the importance of algorithmic transparency.
Job Ad Targeting Based on Gender
Global Witness conducted experiments by running job ads in France and the Netherlands over two-day periods between February and April. These ads were linked to real job postings from employment websites, and researchers selected positions traditionally associated with gender stereotypes, such as preschool teacher, psychologist, pilot, and mechanic.
The ads were targeted towards adult Facebook users of any gender residing in or recently visiting the selected countries. Researchers instructed Facebook’s algorithm to maximize the number of link clicks while leaving the determination of ad recipients to the algorithm itself.
An analysis of the data provided by Facebook’s Ad Manager platform revealed that the ads were often shown to users along heavily gendered lines. One of the complaints filed in the Netherlands states that although advertisers cannot select the gender category, gender still plays a role in the ad delivery process.
For instance, in France, 93% of users shown a preschool teacher job ad and 86% of those shown a psychologist job ad were women. In contrast, women comprised only 25% of users shown a pilot job ad and 6% of users shown a mechanic job ad, according to Facebook’s ad manager platform.
Similarly, in the Netherlands, 85% of users shown a teacher job ad and 96% of those shown a receptionist job ad were women. On the other hand, only 4% of users shown a mechanic job ad were women. The data from Facebook indicates that certain roles had a less pronounced skew: for example, women made up 38% of the users shown a package delivery job ad in the Netherlands.
These findings align with Global Witness’ research in the United Kingdom, where women were more frequently exposed to ads for nursery teacher and psychologist jobs, while men predominantly received ads for pilot and mechanic positions.
The degree of gender imbalance in job ad targeting varied by country. For example, in India, only 39% of users shown a psychologist job ad were women, while in Europe and South Africa, women were more likely than men to see psychologist job ads. Pilot ads shown in South Africa had a more balanced distribution, with 45% of users being women.
Global Witness also conducted tests in Indonesia. However, Facebook’s ad manager could not identify the genders of many users who saw the ads, making it challenging to analyze the results effectively.
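For readers curious how such percentages are derived, below is a minimal sketch in Python using entirely hypothetical impression counts and field names (the actual Ad Manager export format may differ). It computes the share of identified-gender impressions delivered to women for each ad, with a guard for cases like Indonesia, where the platform could not identify the gender of many users who saw the ads.

```python
# Minimal sketch with hypothetical data: computing the share of ad impressions
# delivered to women per job ad, mirroring the kind of percentages reported
# from Facebook's Ad Manager breakdowns. Counts and keys are illustrative only.
delivery = {
    "preschool_teacher_fr": {"female": 9300, "male": 700, "unknown": 150},
    "mechanic_fr":          {"female": 600,  "male": 9400, "unknown": 200},
    "package_delivery_nl":  {"female": 3800, "male": 6200, "unknown": 90},
}

def female_share(breakdown: dict) -> float | None:
    """Return the percentage of identified-gender impressions shown to women.

    Returns None when most impressions have no identified gender, in which
    case a percentage would not be meaningful.
    """
    identified = breakdown.get("female", 0) + breakdown.get("male", 0)
    total = identified + breakdown.get("unknown", 0)
    if identified == 0 or identified / max(total, 1) < 0.5:
        return None  # too many unidentified genders to analyze reliably
    return 100 * breakdown["female"] / identified

for ad, breakdown in delivery.items():
    share = female_share(breakdown)
    label = f"{share:.0f}% women" if share is not None else "insufficient gender data"
    print(f"{ad}: {label}")
```

The 50% identified-gender cutoff in the sketch is an arbitrary illustrative threshold, not a figure taken from Global Witness’ methodology.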
The cause of the gender skew in Facebook’s algorithm is not precisely clear due to limited information about how the algorithm works. One of the complaints in the Netherlands speculates that the algorithm might have been trained on outdated or “contaminated” data regarding the typical gender distribution in certain roles.
Meta, Facebook’s parent company, did not respond to CNN’s questions regarding the training of the ad system algorithm. In a 2020 blog post about its ad delivery system, Facebook stated that ads are shown to users based on various factors, including their behavior both on and off the platform. Earlier this year, Facebook introduced a “variance reduction system,” a machine learning technology aimed at achieving equitable distribution of housing ads in the United States. The company plans to expand this system to employment and credit ads in the US as well.
Seeking Transparency in Algorithms
Between November 2016 and September 2018, Facebook faced five discrimination lawsuits and charges from civil rights and labor organizations, individuals, and workers. These legal actions alleged that Facebook’s ad systems excluded specific individuals from viewing housing, employment, and credit ads based on their age, gender, or race.
One notable investigation was conducted by ProPublica in 2018, revealing how Facebook facilitated the dissemination of discriminatory advertisements by allowing employers to target job ads exclusively to users of a particular gender. The report highlighted instances where companies targeted only men with ads for trucking or police jobs, while nursing or medical assistant jobs were exclusively advertised to women. Facebook responded to the report, stating that discrimination is strictly prohibited in its policies and pledging to defend its practices.
In March 2019, Facebook agreed to pay nearly $5 million to settle the lawsuits. The company also announced the launch of a separate advertising portal for housing, employment, and credit ads on Facebook, Instagram, and Messenger. This portal offers fewer targeting options.
Then-Facebook COO Sheryl Sandberg emphasized that there is a long history of discrimination in housing, employment, and credit. She stated that such harmful behavior should not occur through Facebook ads. The company engaged a civil rights firm to review its ad tools and help prevent misuse.
Later that year, the US Equal Employment Opportunity Commission ruled that seven employers who used Facebook ads targeting specific age or gender groups had violated federal law.
In addition to prohibiting advertisers from targeting employment, housing, and credit ads based on gender, Facebook also prohibits age-based targeting and requires a minimum radius of 25 kilometers for location targeting. The company removed targeting options based on sensitive characteristics, such as religious practices or sexual orientation, from all advertisements on its platform in 2022. Advertisers must adhere to Facebook’s non-discrimination policy, and all ads are accessible to anyone through the Ad Library.
However, researchers have continued to find evidence suggesting that Facebook’s job ad delivery may still exhibit discriminatory patterns, including a 2021 study by researchers at the University of Southern California.
In December, Real Women in Trucking filed a complaint with the US Equal Employment Opportunity Commission, alleging that Facebook’s job ads algorithm discriminates based on age and gender. The complaint states that men receive a significant share of ads for blue-collar jobs, particularly those historically excluding women. Conversely, women receive a disproportionate share of ads for lower-paid jobs in social services, food services, education, and healthcare.
Peter Romer-Friedman, one of the attorneys representing Real Women in Trucking, highlighted the importance of online platforms like Facebook for job and housing searches. He emphasized that missing out on such opportunities due to the lack of access to information can be detrimental.
Romer-Friedman, who was also part of the negotiating team in the 2019 settlement agreement, expressed concerns about potential biases replicated by Facebook’s algorithm despite the promised changes.
Meta declined to comment on the Real Women in Trucking complaint, as filings with the US Equal Employment Opportunity Commission are not publicly available.
The French and Dutch agencies have the discretion to decide whether to investigate the claims made in the recent complaints. Global Witness and its partners hope that the decisions of human rights agencies on the complaints will push Meta to improve its algorithm, enhance transparency, and prevent further discrimination. If the data protection agencies in these countries find Meta to have violated the European Union’s General Data Protection Regulation, which prohibits discriminatory use of user data, the company could face significant fines.
Global Witness’ Naomi Hirst stressed the goal of the complaints, which is to compel Facebook to address the alleged discrimination by revealing the inner workings of its algorithm. She argued that Facebook is contributing to the problem of gendered workforces and jobs, despite sufficient knowledge about the issue.
Seek Legal Representation with Parker Waichman LLP for Gender Discrimination in Facebook Job Ad Delivery
If you believe that you have been a victim of gender discrimination in Facebook’s job ad delivery, it’s time to take action. Parker Waichman LLP is here to provide the legal representation you need to fight against discriminatory practices and seek justice.
With their experience and expertise in employment law and discrimination cases, Parker Waichman LLP is well-equipped to handle complex issues like gender bias in job ad targeting. Their team of dedicated attorneys understands the importance of equal opportunities and fair treatment in the workplace.
By partnering with Parker Waichman LLP, you can benefit from their extensive knowledge of the law and their commitment to protecting your rights. They will thoroughly investigate your case, gather evidence, and build a strong legal strategy to hold Facebook accountable for its discriminatory practices.
Don’t let gender discrimination hinder your career prospects. Take a stand against Facebook’s biased job ad delivery and fight for workplace equality. Contact Parker Waichman LLP today to schedule a consultation and explore your legal options. Together, we can work towards a more inclusive and fair job market for all. Call us today at 1-800-YOUR-LAWYER (1-800-968-7529) for your free case review.