By Christina Sturdivant Sani
Coded Bias begins with MIT researcher Joy Buolamwini’s quest to find out why facial recognition technology failed to accurately recognize her beautifully hued brown face. After viewing the documentary with my friend India, a UX designer, and my husband Hamzat, the retail director at a Washington, D.C.-based boutique that centers Black artisans, we were left with myriad questions about our relationships with technology and how race, class, and social structures influence our ties to tech. Our post-documentary conversation eventually led us to poll nearly two dozen friends to see how their experiences compared to ours (keep reading for the results of our unscientific yet informative survey).
As mid-30-somethings, we grew up in the time of dial-up internet and landline phones. If we missed a call back then, we’d have to check the caller ID or dial *69. I didn’t get my first cellphone—a silver Samsung flip phone—until I was a senior in high school. My son, in comparison, received his first cell at 8 years old.
But when the Pew Research Center began tracking Americans’ internet usage in early 2000, about half of all adults were already online. By 2019, nine out of every 10 adults in the U.S. were using the internet, according to the center’s data. Moreover, smartphones have become the primary way a growing number of Americans access the internet at home; Pew describes roughly one in five American adults as “smartphone-only” internet users.
"It turned out these #FacialRecognitionAlgorithms perform better on the male faces than the female faces. They perform significantly better on lighter faces than the darker faces" – @jovialjoy pic.twitter.com/ye6tIre5Zs
— Coded Bias Documentary (@CodedBias) January 21, 2021
For many Americans, the advancement of technology has made our lives simpler and more convenient. Though we remember a time when encyclopedias were all the rage, we can’t fathom a world in which a quick Google search wouldn’t return dozens of results for our most random curiosities. As a journalist, I tune in to my favorite morning news podcast via my phone on a daily basis. Then I check my news aggregator app to see what I missed. I research stories from my phone, contact sources, and share my published articles from the same device.
Both India and Hamzat use facial recognition technology to open their iPhones. India also uses the feature to open bank apps. Meanwhile, Hamzat uses his likeness to pay for sneakers on apps like SNKRS and to text Memojis of himself to our 12-year-old son.
But as essential a tool as our phones may be, the misuse of their embedded technology can have dangerous outcomes. Before watching Coded Bias, we thought little of how facial recognition systems use algorithms to identify faces, and how those algorithms are created by humans—typically white men—who have cognitive biases.
As we wait for legislation that would guard against bias in the algorithms that influence our daily lives, tech companies continue to sell their facial recognition technology to law enforcement.
Given our newfound knowledge and intrigue, the three of us decided to conduct an unscientific survey to gauge how some of our friends and relatives engage with facial recognition technology. As this unregulated form of tech becomes increasingly embedded in the lives of people across the globe, we wanted to see how people close to us use it, how much they trust it, and what apprehensions they may harbor. After coming up with a few probing questions, I created the survey and we texted it out.
Within two days, we received over 20 responses. About 90% of our respondents were like us: between the ages of 25 and 44 and identifying as Black or African American. Women dominated our sample, with more than 72% of respondents identifying as female.
The Findings
Overall, our findings showed that most of our friends and relatives are familiar with facial recognition software and that the majority of them don’t view it as a threat. Their answers varied more widely on the value they’d place on their likeness if asked to lend it to corporate branding. Here are the results in more detail:
When it came to respondents’ daily use of facial recognition technology, half of them use it to open their phones (like me, 14% of those surveyed had phones that didn’t offer the feature), and 60% use it to open apps and to communicate via text and social media.
More than 86% of folks said they never used facial recognition technology in buildings or places like airports. This made us wonder how many of our friends even realize that the technology is reportedly employed for commercial purposes such as tracking people who enter and leave apartment buildings, monitoring employee attendance at businesses, and seeing how people respond to ads in real time.
One of the most reassuring responses came from the 81% of respondents who said they’d never been locked out of their devices because of facial recognition software. Hamzat found this quite hard to believe, as he’s struggled through multiple attempts to open his phone: “I figured it was too dark, I didn’t have my glasses on, the angle wasn’t right, my forehead was too shiny, I had a booger in my nose, I needed to drink more water. I didn’t know! I just figured it wasn’t my iPhone’s fault but some other despicable set of circumstances,” he said during our discussion.
Our next question, which we purposely made open-ended, surveyed how much money respondents would charge a company for the use of their face or likeness. Many respondents chose fixed amounts, ranging from a couple hundred bucks all the way to $100 million. A couple of people said they would not participate for any amount of money, and one person didn’t think their face would even be desired for replication. A small enterprising group considered fluctuating valuations depending on use and profit. Little do they know, tech companies are already selling their faces to the police.
Our final question offered a list of options to gauge how our friends generally felt about facial recognition technology. Illustrating the idea that societal pressure typically reigns supreme when it comes to adopting the latest technology, more than 40% of our friends admitted they would eventually use facial recognition technology, while 27% had already succumbed to it.
Only 10% of our respondents worried it might not be inclusive of all skin tones, body types, and facial structures. When a Pew Research Center survey posed a similar question, roughly three-quarters of U.S. adults said they think facial recognition technologies are “at least somewhat effective” at accurately identifying individual people. Just over 60% thought the tools are effective at accurately assessing someone’s gender or race.
Joy Buolamwini’s own study on facial recognition bias is also worth exploring.
So what does this all mean?
Well, we’ve always known that technology comes with trade-offs, but we’re still discovering the reach that facial recognition has into different facets of our lives. We’d like to think we have the power to ascribe a value to our own likeness, but without regulations, there’s a good chance that our faces have already been captured, sold, or used against us.
Issues of racist and sexist bias are currently bound to the technology that we love for its convenience and entertainment. Fortunately, films like Coded Bias and people like Joy are speaking truth to power and helping everyday folks claim their faces, spaces, and futures for the good of all mankind.
I’ll leave you with this poem written by Hamzat, inspired by Joy Buolamwini and the great Gil Scott-Heron.
Artificial Revolution: Decoded
The truth is you can’t code your way out of racism.
The revolution will not come tightly baked into an algorithm allowing others to see you plainly, human.
The revolution will not come in an app that distinguishes your Black child from villain or valueless.
You won’t be able to pop into a culturally isolating soundscape with your AirPods because the revolution will not cancel noise brother, just enhance it.
There will be no App the Revolution, it won’t be available on the Apple Store, Google Store, Amazon store or the stores of ignorance white supremacy has allowed for.
The revolution will be raw, uncut, unable to be loaded down or downloaded.
The revolution will be pure, uncorrupted, unencrypted, untethered, unarmed and loaded.
The revolution will require your face, your ears, your mouth, your body
To be yours,
To be put in use to uphold the sacred humanity of those that stand beside and behind you.
Facial recognition will not be your ticket in, sister. No beautifully symmetrical placement of eyes, centering of nose or thinning of lips will guarantee entry.
Your skin, sun-starved or clay-baked will not certify you.
The hair on your pretty little head afro’d, locked, braided or bantu’d, mohawked, blond, red or purpled will not satisfy the requirements of revolutionary understanding.
The revolution will not be coded only for some to decipher in a language unrecognizable to the soul, the mind or the body of the people.
It will not be streamed, flixed, wired or disconnected from the realities of the least of these.
The revolution must simply be.
The revolution is you.
The revolution has always been me. —Hamzat Sani
BONUS: On the subject of poetry, here’s a striking one by Joy Buolamwini herself.
A prolific freelance journalist, Christina Sturdivant Sani has written for nearly two dozen publications including The Washington Post, Zagat, CityLab, Here Magazine, and Washington City Paper. She has covered a multitude of topics including art and culture, food, urbanism, crime, politics, education, commercial real estate, race relations, mental health and wellness, business, and technology. Find her at seesturdi.com.