Shalini Kantayya’s “Coded Bias” effectively brings to light a modern civil rights issue that can be proven with data—the bias within facial recognition programs, particularly against those who are not like the white men who originally built such technology. (Such powerful data-collecting and data-distributing technology is used in America by the likes of Amazon, Google, and Microsoft.) But Kantayya’s doc expands this even further, discussing the invasiveness of this technology around the world, and the harm and misinformation it can create for people of color in America.
The hero of this documentary is Joy Buolamwini, the Ghanaian-American founder of the Algorithmic Justice League. She starts “Coded Bias” by sharing a massive discovery that launched dozens of articles, led to her speaking before Congress, and inspired activism against this technology. As she sits in her office at MIT, she tells of how she discovered this facial recognition problem, in which the AI did not recognize her face. But when she put on a white mask, it did.
The importance of this is enormous, and director Kantayya spends a fleet 85 minutes detailing why, without losing focus. As facial recognition technology becomes a global presence, it carries this harmful bias against people of color, informed by the conscious or unconscious biases of those who created such algorithms. It’s not uncommon for this technology to correctly identify a white face, but then give the wrong information about someone of darker complexion. On top of that, the talking heads in this documentary (mostly women) explain how the algorithms themselves are a type of black box, in which we don’t exactly know what they’re thinking, aside from the copious data they contain. We also don’t know what these black boxes are entirely capable of.
One might expect a documentary about data and algorithms to run a bit dry, but “Coded Bias” defies that by having a lot on its mind and by being quick on its feet, hopping all over the country, and the world. As the film builds a damning case against these algorithms and facial recognition, it takes us to Houston, where an award-winning teacher had his job threatened by faulty job-vetting algorithms, and to Brooklyn, where an apartment building has become closely monitored by facial recognition. A focus is also given to how China uses heavy surveillance to control its citizens and their behaviors; in the United States, we might think we’re distant from such a society, but it’s already here. The impressive pacing in this movie is directly correlated to its ideas: it has many of them, and many vivid examples of how the absence of human input has affected others.
It also helps that “Coded Bias” keeps things in the present, which comes from its focus on documenting activism. In London, Kantayya focuses on members of Big Brother Watch, a surveillance watchdog group that takes its name from George Orwell’s 1984. As the documentary shows in a few gripping scenes, they are a vital force, both holding police accountable when they try to intimidate a young Black boy on the streets who was flagged by surveillance, and then informing that very suspect about why he was targeted. Activism goes side-by-side with the many sobering details of the film, and it gives the proceedings an invigorating sense that’s mightily aware, but hopeful.
Throughout, “Coded Bias” constantly feels like it’s not recounting a saga of grounded science fiction; it’s making us aware that we’re square in the middle of one. This attention to the timeline makes the exchange of ideas all the more powerful, seeing how these women have inspired each other. It’s also gratifying when the movie shows the success that has come to the resistance inspired by the work of Buolamwini and other brilliant women. As it shows the faults in such technological advancements, “Coded Bias” exemplifies the power of free will, which includes our right to learn about something, and the right to shut it down.
Now playing in select virtual cinemas.