Olay Teams Up with Data Scientist to Help Tackle Racial Bias Against Black Women in New Initiative
Credit: Joy Buolamwini

Computer scientist Joy Buolamwini knows a lot about racial bias both in and outside the lab. The MIT Media Lab researcher and founder of Algorithmic Justice League (AJL) works to uncover subtle instances of racial discrimination in facial recognition software, but like many Black people, she’s experienced it IRL as well. 

According to Kaiser Family Foundation research, about 7 in 10 African Americans have experienced unconscious racial bias. This discrimination runs the gamut from social media beauty filters favoring white-identified beauty features to prejudiced AI-driven hiring systems, and it adversely affects people of color in tremendous ways. 

Olay, a company that has built its brand on inclusive beauty, has partnered with Buolamwini for the #DecodetheBias campaign to effect meaningful and informed change by helping to triple the number of women of color in STEM by 2030. 

Buolamwini sat down with Essence to share her experience as a Black woman in STEM, her Olay partnership and the upcoming Netflix documentary that follows the journey to dismantling the systems that reinforce racial bias online against Black women. 

You speak about how one moment during your graduate studies changed the trajectory of your career: common facial analysis software did not recognize your face, that of a Black woman, until you literally donned a white mask. Could you speak more about this experience? How did it make you feel about your place in the graduate program as a Black woman?

Coding in whiteface is the last thing I expected to be doing at MIT as a graduate student. I initially got into computer science thinking I might be able to escape the spectre of racism, sexism, and all the other isms that prevent so many people from realizing their full potential. Hiding with the ones and zeros, I thought I might find some solace from social discrimination, but I was wrong.

Machines reflect both our aspirations and our limitations. My experience of coding in a white mask to have the location of my face detected gave me so much to think about. On the one hand, not being detected by machines that can be used for mass surveillance could be a benefit, an invisibility cloak to evade state control. There is a cost to inclusion.

On the other hand, we have seen the wrongful arrest of a Black man based on faulty facial recognition and the erroneous flagging of a young Black girl preventing her from entering a skating rink. When these systems fail technically, there are harmful and long-lasting consequences. I also realized that as one of few Black women studying computer vision, I had an invaluable outsider perspective. My lived experience gave me insights that my more mainstream colleagues ignored or deprioritized.

The issue with the white mask wasn’t just a matter of being detected or not detected. The bigger questions were: where else is AI being used in society, and how might embedded racial and gender biases harm the very people who are least likely to be building these systems in the first place? Who is at risk of becoming excoded, that is, harmed by AI systems? And what can we do about it? 

What is coded bias and where does it manifest? How does it affect Black women on a daily basis, even those outside of STEM professions? 

Coded bias, or algorithmic bias, is everywhere online. It’s when seemingly neutral machines propagate things like racism, sexism, ageism, and ableism, resulting in harmful discrimination or exclusion for individuals and communities. Algorithms can show unfair biases, influencing who gets hired, who gets access to healthcare, who has loans approved, or who counts as a “beautiful woman.” From social media filters and apps to search engines, these algorithms are “enhancing” beauty (like lightening skin or slimming noses) based on a European standard of beauty, leaving many women of color excluded. 

Headlines continue to remind us of the racial bias and gender bias in AI, both of which harm Black women on multiple fronts, from reinforcing stereotypes to denying critical access to economic opportunity. Let’s start with harmful stereotypes, like the elevation of Eurocentric beauty standards, which diminish the dignity and humanity of Black women. For example, major social media platforms have recently been found to have algorithmic bias. This reminds me of my visual poem AI, Ain’t I A Woman? In that poem, I show AI systems from well-known tech companies denigrating the faces of iconic Black women as well as historic figures. Many of the women are misgendered. In addition to throwing shade, AI systems can block access to financial resources and job opportunities on the basis of race. 

Additionally, a recent study from The Markup showed that a specific mortgage approval algorithm was 40-80% more likely to deny applicants of color than white applicants with similar backgrounds. Many studies show how algorithmic systems have a gender bias in the kinds of ads that are shown. For example, women in some instances are less likely to be shown ads for high-paying executive positions than men. And beyond racism and sexism, studies also show how these systems can propagate ableism, ageism, and more. Think of an ism and it is probably being encoded into some form of tech. But we have the power to decode the bias and change harmful patterns.

How did you find the wherewithal to push through colorist discouragement? What are some of the ways you have reinforced confidence in your skin? 

I love being a highly melanated woman whose very presence can crack open biases in AI systems. It makes me think of the lyrics from Glory – “Sins that go against our skin become blessings.” As a graduate student at MIT, I was able to show that some of the biggest tech companies in the world had skin type and gender bias in their AI systems. I had the opportunity to show a concrete example of how intersectionality must be part of our analysis of all AI systems.

That research has been a major part of the resistance movement against harmful surveillance technology. Instead of taking that moment of coding in whiteface as an indication to leave the field, I found the resolve to stay and make a change. I started the Algorithmic Justice League so that the machines of tomorrow do not make past and present inequalities worse. The fight for racial justice requires algorithmic justice; the fight for gender equality requires algorithmic justice. The ongoing struggle for civil rights must attend to algorithmic justice because we now live in an age where digital gatekeepers choke opportunities. The fight for algorithmic justice requires often-marginalized perspectives, which is why I absolutely support Olay’s efforts to close the gender gap in STEM fields and send girls to code camp as a starting point.

Why did you decide to partner with Olay on a project like this?  

I was drawn to them because of their commitment to closing the STEM gap—the brand is committed to doubling the number of women in STEM and tripling the number of women of color in STEM by 2030. I was thrilled to be a part of the #DecodetheBias campaign because it tackles two issues that are very important to me: coded bias, expressed through exclusionary representation in beauty imagery, and the need to create more equitable opportunities for young girls of color, especially in the fields of STEM. I remember being a little girl and being teased for my dark skin and being told I was not the standard of beauty, so to have the opportunity to be that face that young girls can relate to is incredible. And I am encouraged to see Olay taking real action to empower the next generation of girls in code and celebrating both intellect and beauty. 

What do you want us to know about this initiative?

Role models are so important, and I encourage all of your readers to watch the powerful women of color scholars in the film Coded Bias on Netflix, both to learn about this issue and to see how Black women have resisted oppressive technologies. The end goal of this campaign is to create a digital world that is more representative of us all, and the path to a more diverse definition of beauty requires greater inclusion in the field of computer science.

That’s why I am so excited that Olay is sending 1,000 girls of color to coding camp with Black Girls CODE. And you can help us out! Use the hashtag #DecodetheBias on IG and Twitter to help us send even more girls to code camp. I really commend Olay for holding up the mirror to an industry that they’re a part of, raising the issue, and then taking corrective actions. We have the power to make sure the inequalities of the past are not coded into the technologies of the future.
