In an AI-obsessed world, Safiya Noble urges us to hold on to our humanity.
As AI dominates headlines and spreads like wildfire across the internet, Dr. Safiya Umoja Noble stands ready to remind everyone that machines are not better than humans, and never will be.
Noble is a tech justice advocate, an esteemed scholar, and a professor of Gender Studies and African American Studies at UCLA. Her work focuses on the intersection of technology and human rights, protecting vulnerable communities from the ways AI can exacerbate inequities like racism and sexism.
She started down this path in 2011, ahead of a visit from her nieces, who would use her laptop when they stayed at her house. “I would often try to search for things I thought they might search for; I googled ‘Black girls’ to see what would happen, with a hunch it would not be good,” Noble tells InStyle. “Eighty percent of the first page was porn.” She googled “Latina girls” and “Asian girls” and got similar results.
Noble started asking friends and family if they knew searches were returning this sort of content, and she was frustrated that people weren’t surprised. “It was almost normal and expected,” says Noble. “People were like, ‘Well, of course you’re gonna get porn when you google Black girls.’”
People are using these search engines like they’re truth tellers, like fact checkers.
“That was when I was like, Oh, hell no, we’re just okay with that? We’re not going to try to fix it? That was definitely the wake-up call that I needed to work on it. People are using these search engines like they’re truth tellers, like fact checkers.”
Noble had previously worked on African American representation and access in politics and marketing for more than 15 years, so she was already attuned to issues of injustice facing Black communities. At the time of her Google search, she was in a library and information science Ph.D. program, thinking about those issues in the context of the internet and knowledge.
Noble wrote a book, Algorithms of Oppression: How Search Engines Reinforce Racism, about the ways search engines misrepresent people, information, and knowledge, particularly for people who are marginalized. Its groundbreaking contribution was the argument that an algorithm is coded with value judgments that reflect social oppression.
Safiya Noble at the NAACP Image Awards.
Her book was published to critical acclaim, and in 2021 she won a prestigious MacArthur “Genius Grant” for her work on algorithmic discrimination. Google changed how its search algorithm handled such queries. Prince Harry and Meghan Markle presented her with the NAACP-Archewell Digital Civil Rights Award. “Safiya’s work speaks to a new chapter in the movement for civil rights,” said Harry in 2022.
“The problem is that we have racism embedded in the code,” says Noble today. And what is true of search engines is also true of AI large language models, like ChatGPT, that are rapidly changing how decisions are made every day in government, in corporations, and in our personal lives. AI is trained on data that reflects racism and sexism, so its results can be deeply problematic. Noble jokes that she used to need a T-shirt that said, “Men shout at me at conferences,” because men would insist that coding is math, and math can’t be racist or sexist. But now that AI is increasingly being discussed, the ways it can harm people are getting more attention, and Noble is often at the forefront of these conversations.
“The thing about technical, mathematical, digital systems is that they always get narrated as being objective, fair, purely math, or as inevitably better than how humans think,” says Noble. But datasets can reflect historical inequality, categorize people in exclusionary ways, or put them in danger. AI systems, for example, have produced racial profiling that affects people’s ability to get mortgages approved or make it past automated job screenings. Facial recognition technology has misidentified people with darker skin tones, Black women in particular, resulting in wrongful arrests.
“People experience systems and they don’t understand what happened: Why was I denied a job? Why was I not afforded an opportunity? They just ascribe it to their own personal failure. That is insufficient and deeply unfair and likely violating our rights.”
Noble consults for the White House, the Office of the Vice President, the Federal Trade Commission, and more, pushing for policy that puts humanity before technology. She also runs a foundation focused on civil rights and technology, the Center on Race and Digital Justice (CRDJ). The center has awarded more than $600,000 in funding to researchers and activists building technology that serves the public and promotes equity, while calling out harmful technology that exploits the public, like data brokers invading people’s privacy. Two of CRDJ’s current projects are the 1890 Project, a new AI platform for HBCUs that can protect the intellectual property of Black faculty and students, and the Cyber Collective, a group of women of color innovating around cybersecurity and protecting everyday people online. CRDJ is hosting its first conference in May to coincide with the endowment of Noble’s chair at UCLA.
Safiya Noble (left) speaking on a panel alongside two other women, in front of a sign for Dove.
Her core belief is that knowledge filtered through technology is problematic and undermines our humanity; it is tied to her faith that the more people learn about others who are different from them, the more empathetic they become.
Noble points to book bans and the elimination of Ethnic Studies programs from college campuses as examples of society restricting knowledge for the worse. “It’s getting harder and harder to have the knowledge we need to learn about each other and care for each other and respect each other and know where people are coming from and help people solve problems.”
It is virtually impossible to speak to Noble without her recommending multiple other experts to tap or read about, and they are almost always women of color. She’s known in the tech industry as the ultimate connector, constantly uplifting others. And, as her son learned when he once asked if they could just google something for the answer, she’s a staunch advocate of doing your own research by reading books and seeking multiple viewpoints, not relying solely on technology to do the critical thinking for you.
Noble’s mission is not to stop people from using their iPhones. She’s advocating for large-scale policy and structural interventions: laws that regulate AI, hold the makers of harmful tech criminally liable, and direct fines into a public fund to compensate people who are harmed.
I want people to have a greater sense of personal power and strength about the greatness of who we are, rather than diminishing and flattening us to data points.
“We can demand that harmful and discriminatory AI be regulated or outlawed,” says Noble. “We can require 100 percent renewable energy standards on all AI given the incredibly destructive impact of AI on the environment. We have many points of intervention that have little to do with stopping an average person from using an AI prompt. That’s the wrong place to focus on shaping the future of AI.”
“I want people to have a greater sense of personal power and strength about the greatness of who we are, rather than diminishing and flattening us to data points,” says Noble. “My journey will always be informed by being connected to a community that is still suffering and is connected to other communities that suffer. Those are in some ways by design, and those are also things we can change and we must.”