
Search Engines, Algorithms and Bias

Podcast Listens

"Beyond the Search" tells the story of two leading information researchers who made shocking discoveries about hidden biases in the search technology we rely on every day. It begins when Dr. Safiya Umoja Noble set out to find activities to entertain her young nieces and entered the term "Black girls" into her search bar: pages of pornography appeared as the top results. Subsequent searches for "Latina girls" and "Asian girls" returned similarly sexualized and racist results. Concerned about the effect of such dangerous stereotypes, Noble embarked on the research that would lead to her groundbreaking book, "Algorithms of Oppression." Along the way, she discovered the work of another prominent Black researcher, computer scientist Dr. Latanya Sweeney, who had made her own disturbing discovery: when she searched her own name, she got online ads offering access to an arrest record. As Sweeney had never been arrested, she began investigating discrimination in online ad delivery. Her findings astounded her: searching a name more commonly given to Black children was 25% more likely to deliver an ad suggestive of an arrest record. Both researchers share concerns about how everyday online searches can reinforce damaging stereotypes, and both explore how technology can be made more equitable.

Recommended Videos

Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.
MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.