She Spoiled It

Review: Coded Bias

20/12/2020
What if the computers making hugely important decisions about your life were riddled with built-in biases that perpetuated unfairness? And there was almost nothing you could do to stop it?

Well, I hate to break it to you, but the documentary “Coded Bias” highlights the ways in which algorithms exacerbate inequalities in society, and it is chilling.

The film opens with Joy Buolamwini, a black woman and computer scientist at MIT, explaining how she tried to make a fun interactive mirror, only to find that the camera couldn’t recognize her face. When she put on a white mask, the algorithm instantly recognized her. The software couldn’t see her real face because it had largely been trained on white men’s faces.
This leads her on a journey into the algorithms that control so many aspects of our lives yet are built on inequality and bias. The film escalates from there, layering new scenarios onto each concept until viewers feel genuine terror and despair at the thought of our lives being in the hands of these algorithms.

Through a series of talking-head interviews, illustrations, and documentary footage, the film manages to be educational yet passionate, explaining its ideas clearly. A key aim is to make sure people understand the future implications of allowing companies to keep developing biased programs.

This isn’t just fear-mongering. It’s a clear explanation of why algorithms with the power to affect our lives need to be regulated to ensure they are truly fair.

Particularly important to the director Shalini Kantayya (2015’s “Catching the Sun”) is the representation of women and people of color. Many of the film’s examples are about black women in particular, as the group most likely to be negatively affected by such software.

Algorithms decide things like whether people are approved for loans, accepted to college, and given access to health care or housing. These systems are assumed to be fair because they’re just computers.
But if programmers feed them data that reflects existing biases, algorithms simply learn to replicate or even amplify those biases. Amazon, for instance, discovered that its experimental recruitment tool downgraded female applicants because the historical hiring data it learned from came overwhelmingly from men.
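For readers who want to see that dynamic spelled out, here is a minimal, entirely hypothetical Python sketch; it is not from the film or from Amazon’s actual system, and the data, names, and numbers are all invented.

```python
# Illustrative only: a toy, made-up sketch of how a scoring rule "trained" on
# skewed historical hiring data ends up reproducing that skew.
import random

random.seed(0)

# Hypothetical history: far more men were hired, so "male" correlates with "hired".
history = (
    [{"gender": "male", "hired": random.random() < 0.7} for _ in range(900)]
    + [{"gender": "female", "hired": random.random() < 0.2} for _ in range(100)]
)

def past_hire_rate(records, gender):
    """Naive 'learning': score a group by how often that group was hired before."""
    group = [r for r in records if r["gender"] == gender]
    return sum(r["hired"] for r in group) / len(group)

scores = {g: past_hire_rate(history, g) for g in ("male", "female")}
print(scores)  # roughly {'male': 0.7, 'female': 0.2}

# Rank two equally qualified new applicants using the learned scores:
# the female applicant is pushed down purely because of the biased history.
applicants = [("Sam", "female"), ("Alex", "male")]
print(sorted(applicants, key=lambda a: scores[a[1]], reverse=True))
```

The point is simply that a system which learns from a skewed past will rank a skewed future.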

A few real-life scenarios are used as object lessons to bring these concepts to life. A facial recognition camera mistakenly flags a 14-year-old black boy, who is left shaken after police stop, search, and question him. These cameras are simply not accurate enough to rely on, especially for people of color, who are already more likely to be racially profiled by police.

Even at the more benign end of the scale, algorithms tailor online advertisements to people’s perceived interests. The more affluent may see ads for things to buy; on the other side of the coin are predatory ads targeting the poor. People with gambling problems are pushed toward betting websites, and people with money problems are shown ads for extortionate payday loans.

These real-life examples make for a hard-hitting documentary that shows viewers how these systems could affect them personally.

One risk with educational, issue-led documentaries is that they leave viewers feeling lectured to, helpless, or frustrated. “Coded Bias” ends on a positive note, but it definitely touches a nerve. It links inequalities of gender, race, and class to the people who profit from building algorithms without any regulation.
It would have been interesting to hear the perspective of developers who are trying to do good. Although the film mentions changes and improvements that tech companies are making, we don’t hear from any of them directly. Are these changes substantive, or merely a reaction to public pressure?

Overall, “Coded Bias” makes viewers question things they previously assumed to be safe. Where AI starts to make decisions about our lives without any human intervention, we should challenge it. Where unreliable facial recognition software sends police to our neighborhoods, we should resist. That isn’t always possible, but perhaps we can take solace in the fact that passionate and intelligent experts are fighting for the regulation and fairness of AI at the highest levels.

