“You have to see it to be it” is a motto Middle School Coding & Game Design Teacher Sally Miller frequently repeats to her students, and this year the 7th and 8th graders are diving deeper into issues of representation in STEM fields than ever before. In this brand-new unit, students are studying a phenomenon known as “algorithmic bias,” or the ways in which computer algorithms (sets of instructions designed to perform a task) can produce unfair outcomes that privilege one group of users over another.

Above: Joy Buolamwini discusses algorithmic bias at TEDxBeaconStreet during her presentation on Gender Shades.

The class was rooted in the work of the Gender Shades project, which exposed discrimination embedded in algorithms that touch nearly every sphere of daily life, and included a panel discussion with female engineers led by alumna Taylor Daughtery ’08. Gender Shades was created by MIT Media Lab computer scientist Joy Buolamwini, a Black woman who, while working on a project that used a common facial recognition software package, discovered that the software repeatedly failed to detect her face until she put on a white mask. She and her team evaluated the gender classification systems in three commercial facial recognition packages and found substantial disparities in accuracy: darker-skinned women were the most misclassified group, with error rates approaching 35%.
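The arithmetic behind that finding is straightforward to reproduce. Below is a minimal Python sketch, not the Gender Shades code itself, of how an audit like this works: run a classifier over labeled faces and compute the error rate separately for each demographic subgroup, then compare. All subgroup names and data in the sketch are hypothetical.

```python
# A minimal sketch (not the Gender Shades code) of a subgroup audit:
# count misclassifications per demographic subgroup and report each
# subgroup's error rate. All data below is hypothetical.

from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (subgroup, true_label, predicted_label) tuples."""
    errors = defaultdict(int)   # misclassifications per subgroup
    totals = defaultdict(int)   # examples per subgroup
    for subgroup, true_label, predicted in records:
        totals[subgroup] += 1
        if predicted != true_label:
            errors[subgroup] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical audit results for one gender classifier:
sample = [
    ("lighter_male", "male", "male"),
    ("lighter_female", "female", "female"),
    ("darker_male", "male", "male"),
    ("darker_female", "female", "male"),    # a misclassification
    ("darker_female", "female", "female"),
]
print(error_rates_by_group(sample))
# {'lighter_male': 0.0, 'lighter_female': 0.0,
#  'darker_male': 0.0, 'darker_female': 0.5}
```

The disparity, not any single error rate, is the finding: a classifier can look accurate overall while failing badly on one subgroup.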

“I was not aware of algorithmic bias until recently in class [when] we learned how algorithms govern so many different aspects of our lives from behind the scenes, and also how so many people do not know about this yet,” said Sophia Y. ’26. “In my opinion, more people should be concerned about these issues and start to question how big companies are using artificial intelligence (AI).”

“AI touches so many different aspects of life and all different careers,” said Miller. “AI systems like facial recognition software learn from training data, and that training data may reflect the priorities, preferences, and prejudices of the computer scientist. If all these systems are using biased datasets to learn how to interact with the world and make educated guesses about our lives, it can lead to very real problems.”
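Miller’s point about skewed training data can be made concrete with a toy example. The Python sketch below uses entirely hypothetical numbers, not any real facial recognition system: it trains the simplest possible classifier on a dataset that is 95% one group, and the learned decision boundary fits that majority group well while misclassifying far more members of the under-represented group.

```python
# A toy illustration (hypothetical data, not a real system) of how a
# model trained on a skewed dataset performs worse on the group the
# data under-represents.

import random
random.seed(0)

def make_group(pos_center, neg_center, n):
    """One numeric feature per example; label 1 = positive, 0 = negative."""
    data = [(random.gauss(pos_center, 1.0), 1) for _ in range(n)]
    data += [(random.gauss(neg_center, 1.0), 0) for _ in range(n)]
    return data

group_a = make_group(pos_center=2.0, neg_center=-2.0, n=500)
group_b = make_group(pos_center=4.0, neg_center=0.0, n=500)  # shifted features

# "Biased" training set: 95% of examples come from group A.
train = random.sample(group_a, 475) + random.sample(group_b, 25)

# Simplest possible classifier: a threshold halfway between the mean
# positive and mean negative feature values seen in training.
pos = [x for x, y in train if y == 1]
neg = [x for x, y in train if y == 0]
threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def error_rate(data):
    wrong = sum(1 for x, y in data if (x > threshold) != (y == 1))
    return wrong / len(data)

print(f"error on group A: {error_rate(group_a):.1%}")  # low (~2%)
print(f"error on group B: {error_rate(group_b):.1%}")  # much higher (~23%)
```

Nothing in the training procedure is malicious; the disparity comes entirely from who is, and is not, well represented in the data.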

Above: Taylor Daughtery ’08 (top left), Jeannie Nguyen (top right), and Monica Grandy (bottom left) speak to Westridge students.

To help students understand the real-world implications of this issue, Miller invited Westridge alumna Taylor Daughtery ’08, a senior iOS software engineer at Headspace, to the class to participate in a panel of female computer scientists discussing their experiences working in technology. Additional panelists included Daughtery’s Headspace colleagues Monica Grandy (senior software engineer) and Jeannie Nguyen (engineering manager). The panelists talked in depth with students about how it felt to often be the only woman in the room throughout their careers, the hurdles they have faced, and what a typical day looks like at their current jobs.

“It was really inspiring to hear their experiences as women in tech and all the challenges they’ve had to face,” said Sandhya R. ’25. “I myself am pretty interested in the computer science/engineering field and I found it intriguing to hear about the things they had to go through.”

Students discussed the “coded gaze,” a term for algorithmic bias coined by Buolamwini, and how it might appear in technology they use daily (such as social media apps, AI assistants like Apple’s Siri and Amazon’s Alexa, and more). Most recently, they watched the award-winning documentary Coded Bias (now available on PBS and Netflix), which follows Buolamwini, founder of the Algorithmic Justice League, and other data scientists, mathematicians, and watchdog groups around the world as they go public with their findings and testify before Congress to push for the first-ever legislation governing facial recognition in the United States.

"To me, artificial intelligence is like a double-edged sword," said Manon I. '26. "We have the brightest minds who created these advanced technologies supposedly to promote a better quality of life for humanity, but unfortunately, there is this hidden risk that these technologies are not developed and applied for what’s best [for everyone]. We should have a robust structure in place to monitor every process for bias and to make sure we all can live in a place safe for everyone." 

“In my class, I always talk about the underrepresentation of women and minorities in STEM and the importance of closing the gender gap,” said Miller. “The folks who are in engineering careers are a very homogenous group and are definitely not diverse enough to reflect the perspectives of the world. Because of that, I knew that bias in technology existed, but I didn’t know the full extent of it until this summer when I came across the Gender Shades project.”