At the inaugural Leadership Connect Global Summit, Indeed CEO Chris Hyams and Dr. Safiya Noble discussed the importance of ethically using AI in hiring.

Key Takeaways

  • Dr. Noble has spent years researching how existing systemic biases influence algorithm-based technology, which includes AI-powered recruitment tools.
  • Technology made to detect bias, like job description bias scanners, is no substitute for human intervention; educate your recruiters on how to test AI tools and vet the results.
  • To promote equitable hiring, employers can normalize the discussion of responsible AI with their teams, implement a skills-first hiring approach and reward DEI efforts in performance evaluations.

“A lot of people think that when they're building AI and they're building algorithms, they're just doing statistics. And statistics can't be racist because it's just math,” said Dr. Safiya Noble, a leading interdisciplinary researcher on the internet and its impact on society, as she addressed the nearly 100 talent executives representing 10 countries at Indeed’s inaugural Leadership Connect Global Summit. “It's a little bit like saying to be human is just to be mitochondria and cells. I mean, that's true. But that's not the whole picture by any stretch.”

Dr. Safiya Noble shared insights about how existing systemic biases influence algorithm-based technology like AI. She believes that the intersection of bias, technology and hiring is “a life-or-death kind of issue in our society.”

Dr. Noble — UCLA professor and Faculty Director of its Center on Race and Digital Justice, Co-Director of the Minderoo Initiative on Technology and Power and best-selling author of “Algorithms of Oppression: How Search Engines Reinforce Racism” — joined Indeed CEO Chris Hyams for a candid conversation about the promises and challenges of AI in the world of hiring, which was the topic of many Indeed FutureWorks 2023 presentations the following day.

“This event really is a call to action. It is a recognition of our collective opportunity and responsibility to help shape the future,” said Hyams, who interviewed Dr. Noble in an episode of the Indeed podcast series Here to Help.

“For you all as employers, think about the way you use search to validate, look into or investigate people when these technologies are grotesquely misrepresentative,” said Dr. Noble. “Having a job, being able to take care of oneself — I see that as a life-or-death kind of issue in our society.”

From defining “oppressive algorithms” to gauging the usefulness of tools like AI-powered bias scanners, we rounded up highlights from her discussion with Hyams that explored concerns about the rise of AI — and what to do about them.

Indeed Leadership Connect brings together top leaders in HR and talent attraction (TA) from across the world to discuss their current hiring challenges, share insights and strategize solutions for helping all workers thrive. If you’re an HR or TA leader at the VP level or above, learn more and apply to join the Indeed Leadership Connect community.

What Is Algorithmic Oppression, and How Does It Impact Hiring?

To ground the discussion, Hyams first asked Dr. Noble to define exactly what she means by “algorithms of oppression” in her research.

“We live in power systems — systems that are overdetermined by histories of racism, of sexism,” said Dr. Noble. “These instructions that we give to computers — which is really what an algorithm is — they are connected to power systems. And what Cathy O'Neil, my colleague who wrote the great book ‘Weapons of Math Destruction,’ says is algorithms help people who are already doing well do better and people who aren't doing well do worse.”

Dr. Noble recounted what first spurred her to study how the algorithms behind search engine results both reflect and uphold existing systemic biases. As a marketer before the boom of search engine optimization (SEO), she “gamed” search engines to help her clients appear on the first page of results. While many people at the time talked about search engines like public libraries, she viewed them as advertising-driven technologies.

“I started thinking about what happens when you want to make knowledge about people accessible and available, and they are reliant upon a search engine to give it to them,” she said. Using U.S. census data of racial, ethnic and gender categories, she ran multiple searches to test it out. “What we got back was stunning,” she said. “When you searched on Black girls, Latino girls, Asian girls, it was almost exclusively pornography that came back in the results. Now you didn't have to add the word ‘sex,’ you didn't have to add the word ‘porn.’ Girls of color, women of color, were synonymous with pornography.”

“That's what I think of when I think of algorithms of oppression: It's the way these technologies work in systems of power that have profoundly consequential effects for individuals and for communities,” she said. 

This problem occurs in the hiring process when employers attempt to research the backgrounds of their candidates. “We know that, not only are search technologies used to vet people, but that these are often erroneous,” said Dr. Noble. As an example, she described the experience of Latanya Sweeney, a renowned Black computer science professor who has served as Chief Technologist at the U.S. Federal Trade Commission (FTC). When Sweeney was interviewing for a position at Harvard, the interviewer asked her about her (nonexistent) criminal past because searching her name served up ads for criminal background checks.

“So she did this very famous study where she took over 1,000 African-American-sounding names like her own, Latanya, and she ran them through search to see what would happen,” Dr. Noble said. “Overwhelmingly, if you had a Black-sounding name, the ads that came up and how you were framed was around criminal background checks, mugshot databases and otherwise.”

“It shows us that the kinds of words that get associated with people and ideas, whether it's their names or gender markers, racial markers, class markers, the part of the country that you live in, your zip code — all of these things are really built into so many of these technologies,” she said.

Hyams advocated that talent professionals educate themselves on potential negative effects of AI tools — like what Dr. Noble’s research uncovers — before using them to hire. “You can't build big machinery without having training and backup systems and all those kinds of things to make sure that they're being used safely,” he said. “What we're advocating for is understanding what the challenges are [with AI in hiring] so we can think about how to be as safe as possible.”

“What we're advocating for is understanding what the challenges are [with AI in hiring] so we can think about how to be as safe as possible,” said Indeed CEO Chris Hyams.

How Effective Are Automated Bias Scanners and AI-Powered Recruitment Tools?

Many people in talent acquisition use bias-scanning tools to check content like job descriptions for exclusionary language. Hyams asked the crowd to raise their hands if they used such tools, and a smattering of hands went up across the room. While acknowledging that these tools are a step in the right direction, Dr. Noble said that a recruiter’s judgment is still the most valuable guard against bias. “We still have to really be involved in constantly articulating our values, who we’re looking for [and] the value of diverse teams,” she said.
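
To see why these scanners are only a first pass, here is a minimal sketch of the keyword-matching approach that the simplest of them build on. The word lists and function below are illustrative assumptions, not any vendor's actual implementation; real products go further, but the core limitation shows up even here: matching words says nothing about context, tone or intent, which is exactly where a recruiter's judgment comes in.

```python
import re

# Illustrative word lists in the spirit of research on gender-coded language
# in job ads (Gaucher, Friesen and Kay, 2011) -- deliberately tiny, not the
# vocabulary any commercial scanner actually ships with.
MASCULINE_CODED = {"aggressive", "competitive", "dominant", "ninja", "rockstar"}
FEMININE_CODED = {"collaborative", "nurturing", "supportive", "interpersonal"}

def scan_job_description(text: str) -> dict:
    """Flag potentially exclusionary terms in a job description.

    A word-list match like this catches obvious phrasing but misses context,
    tone and implicit signals -- which is why a recruiter still has to vet
    the results rather than trust the scan.
    """
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

print(scan_job_description(
    "We want a competitive rockstar engineer who is also collaborative."
))
# {'masculine_coded': ['competitive', 'rockstar'], 'feminine_coded': ['collaborative']}
```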

Dr. Noble recalled her experience in multicultural advertising and marketing, where she often had to tell clients to be explicit when inviting African Americans to an event. “Job ads are a similar kind of thing,” she explained, “where people, unless they explicitly see themselves in them, are not going to apply. Or they're going to presume that you're looking for the folks from those top five universities. People self-select out based on the way that we write job ads.”

Hyams cautioned against becoming overly reliant on technology in the hiring process. “Even if there was some magical technology that could counteract all the problems with other technology, you wouldn't want to just have that happen on its own,” he said. “You want teams to be aware of and thinking about these things.”

For those using AI tools to narrow down applicant pools, Dr. Noble urged employers to look at all resumes — not just those that an AI filter picks out. “AI is the ultimate standardization; it is the ultimate classification tool,” she said. “It doesn't allow us to see the fullness and it doesn't help us understand where people come from, what they've been through.”

Hyams emphasized using AI to augment your hiring decision-making, not replace it. Indeed’s AI-powered Matched Candidates tool recommends candidates whose profiles match your job descriptions, but it is still up to the recruiter to select the best fit. With this combination of AI and human matching, candidates are 17 times more likely to apply for the job.
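
Indeed has not published how Matched Candidates works internally, so the sketch below illustrates only the broader pattern the paragraph describes: an algorithm builds a shortlist and a person makes the decision. It uses plain bag-of-words cosine similarity with invented candidate data; none of the names, profiles or scoring choices reflect Indeed's actual system.

```python
from collections import Counter
import math
import re

def tokenize(text: str) -> Counter:
    """Bag-of-words token counts for a piece of text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(count * b[token] for token, count in a.items())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(job_description: str, profiles: dict, top_n: int = 3) -> list:
    """Rank candidate profiles by textual overlap with a job description.

    The ranking only builds a shortlist for a recruiter to review; the
    final judgment about fit stays with a human.
    """
    job_vec = tokenize(job_description)
    scored = [(name, cosine_similarity(job_vec, tokenize(text)))
              for name, text in profiles.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]

# Invented example data -- not real candidates or a real job posting.
candidates = {
    "Candidate A": "python data analysis and dashboard reporting experience",
    "Candidate B": "warehouse logistics and forklift certification",
    "Candidate C": "sql reporting, python scripting and stakeholder communication",
}
for name, score in recommend("analyst role: python, sql and reporting", candidates):
    print(f"{name}: {score:.2f}")  # a shortlist, not a hiring decision
```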

As part of Indeed’s commitment to help 30 million job seekers facing barriers get hired by 2030, Indeed has worked to promote hiring based on skill, not credentials. “We like to say talent is universal, opportunity is not,” said Hyams. “So if you believe that opportunity is not evenly distributed and you're looking at job titles as a basis for someone capable of doing this job, you’re guaranteed to perpetuate those inequities.”

Dr. Noble said, “There's no replacement for a gut check, talking to someone and hearing their story. Those stories are really powerful and can inform and make a lot of organizations better.”

How Do We Combat Algorithmic Oppression?

Hyams asked Dr. Noble how she and her team, on the cutting edge of identifying these issues, recommend combating them. “The thing that I think about often is, of course, who is building the technologies,” said Dr. Noble. “We know that women and people of color — especially Black, Latino and Indigenous students — are massively underrepresented in engineering programs. They're not going to pipeline into the top tech companies.”

She pointed out that those with degrees in humanities and social sciences are usually working in fields like marketing and public relations, but their education is just as valuable in technical areas. She recommended using diverse teams and knowledge sets when developing products or systems. “We should be hiring alongside those folks while we're dealing with these pipeline issues — PhDs of Black studies, gender studies, Indigenous studies — and give them the kind of power that you give the PhD in computer science,” she said.

How Do We Get Started?

During the Q&A portion of the discussion, one attendee asked how to get started making change at their company. Dr. Noble emphasized the importance of checking all hiring decisions through the lens of how they may impact different populations, as well as holding vendors responsible. “Ask your vendors to be accountable to you and explain to you, what are the data systems that this AI is trained on? Where do the data sets come from? How can we modify them? How is it learning?” she said.

Staying engaged in the discussion also helps normalize it with your teams and leaders. “In every core operation of a company, these conversations could be happening specific to that area, whether it's HR, supply chain management or other kinds of services,” she said. She encouraged talent leaders to consider adopting responsible AI as an environmental, social and governance (ESG) commitment and hiring an internal team that can focus on this issue every day in all areas of an organization.

Employers can empower their teams by celebrating different skill sets and ESG efforts. “One thing that we do in higher education now is … our commitments and our work around diversity and inclusion get rewarded and evaluated as part of our performance review,” Dr. Noble said. “[That is] shifting the way in which we prioritize and value different kinds of work.”

She recognized that it can be difficult to make these changes because they require an entire shift in company culture, but leaders don’t need to come up with one grand action plan. Rather, “it’s really a million little decisions every day that make change,” she said.

Indeed CEO Chris Hyams previously interviewed Dr. Safiya Noble in an episode of the Indeed podcast series Here to Help.

Learn more about Indeed Leadership Connect, which offers a variety of virtual and in-person events throughout the year. To stay informed and educate your team about algorithmic bias, check out the reading list Dr. Safiya Noble curated that, in her words, “will make you the most interesting person at a cocktail party.” And keep up with her ongoing work at safiyaunoble.com.