Can Software Eliminate Hiring Bias?

April 26, 2018 · Yael Grauer

How machine learning can help companies become more diverse.

GIF by Rachel Holland.

The lack of diversity among software engineers remains pervasive despite years of publicity, initiatives, and hand-wringing among industry leaders. Some write it off as a “pipeline problem”; others point to hostile workplace cultures or question whether massive corporate investments have paid off. Whatever the laundry list of possible reasons for tech’s inclusion problem, one thing we do know is that progress is slow: tech companies employ a higher share of white — particularly male — software engineers and technicians than any other industry, according to the US Equal Employment Opportunity Commission.

Waiting a generation to see results is unacceptable, not only for society at large but also for a company’s bottom line. Some businesses are hoping that artificial intelligence (AI) will prove a more immediate fix. Throughout the hiring process, from recruitment to interviewing to evaluating candidates, there are biases that even individuals with the best of intentions can be subconsciously influenced by. If a computer can remain objective, the question becomes: how do you leverage its smarts and optimize for diversity?

“Alexa, Who Should We Hire?”

Many employers have turned to blind screening processes, and for good reason: when recruiting agency Speak With a Geek provided 5,000 candidates to the same group of employers, only five percent of those selected for interviews were women — but when the same candidates’ demographic details were suppressed, that figure jumped to 54 percent.
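For illustration, here’s a minimal sketch of what field-level suppression might look like in code. The record structure and field names are hypothetical, not drawn from any vendor described here.

```python
# A minimal sketch of blind screening: strip demographic fields from a
# candidate record before it reaches reviewers. Field names are hypothetical.

DEMOGRAPHIC_FIELDS = {"name", "gender", "age", "photo_url", "address"}

def blind(candidate: dict) -> dict:
    """Return a copy of the candidate record with demographic details suppressed."""
    return {k: v for k, v in candidate.items() if k not in DEMOGRAPHIC_FIELDS}

candidate = {
    "name": "Jane Doe",
    "gender": "F",
    "skills": ["python", "distributed systems"],
    "years_experience": 7,
}
print(blind(candidate))  # {'skills': [...], 'years_experience': 7}
```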

But blind screenings address only some biases. An applicant’s previous employer may confirm an interviewer’s preconceptions about the candidate, or an interviewer may intuit that the applicant will be bored in the role based on their past experience.

“Think about it this way: if a doctor wanted to know how much a patient weighed, would it be more effective to ask them their weight or put them on a scale?”
—Janelle Szary, pymetrics data scientist

This is why pymetrics aims to do away with resumes altogether. “They’re notoriously biased,” says pymetrics data scientist Janelle Szary. The company uses neuroscience-based games and AI to match candidates with companies based on the specific traits they exhibit and how they compare to the company’s existing staff.

A candidate plays a minimum of 12 computer-based games, which take approximately 30 minutes to complete and measure attributes like attention, altruism, risk-taking, memory, and emotion perception. The applicant’s results are then compared to the results of the company’s exemplary employees, who have played the same games, to identify applicants whose traits match those proven valuable for success at the company.
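To make the matching idea concrete, here’s a hedged sketch in Python: each person is reduced to a vector of game-derived trait scores, and a candidate is compared to the centroid of a company’s top performers. The trait list and the cosine-similarity choice are illustrative assumptions, not pymetrics’ actual method.

```python
# A sketch of trait-based matching under simple assumptions: represent each
# person as a vector of game-derived trait scores and compare a candidate to
# the centroid of a company's top performers via cosine similarity.
import numpy as np

TRAITS = ["attention", "altruism", "risk_taking", "memory", "emotion_perception"]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_score(candidate: np.ndarray, top_performers: np.ndarray) -> float:
    """Similarity between a candidate and the average profile of top performers."""
    centroid = top_performers.mean(axis=0)
    return cosine(candidate, centroid)

rng = np.random.default_rng(0)
top_performers = rng.random((50, len(TRAITS)))   # 50 exemplary employees
candidate = rng.random(len(TRAITS))
print(f"match score: {match_score(candidate, top_performers):.3f}")
```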

“An applicant’s results cannot be ‘good’ or ‘bad,’” Szary said. “Our games are all trade-offs: you can be very fast, but then you’ll probably get some things wrong. Or you can be very accurate, but then you’ll probably be slower. Neither is ‘right’ or ‘wrong,’ they just reflect different cognitive dynamics.”

Pictured above is an assessment of equality across ethnic groups in a pymetrics model: the percentage of candidates from each demographic group that passed the model (defined as ranking in the top 20 percent). Pymetrics adjusts the model repeatedly, testing different combinations of traits, until the heights of all bars (average pass rates under k-fold cross-validation, with brackets marking +/- two standard deviations) are roughly similar, showing that the model’s pass rate isn’t affected by ethnicity.

For example, a high score on attention may indicate focus, while a low score indicates flexibility and an ability to pick up new or peripheral information. Supported by decades of neuroscience research, these games “measure established building blocks of cognitive and emotional functioning,” said Szary. “Think about it this way: if a doctor wanted to know how much a patient weighed, would it be more effective to ask them their weight or put them on a scale?”

But having a machine churn through the “right” data doesn’t necessarily eliminate bias, which is why pymetrics actively modifies each model it designs for a client. Using semi-supervised machine learning to identify biases that arise from modeling for specific traits, pymetrics ensures that no particular race or gender is more likely to be judged a fit for a job than any other. The company does this by creating and refining test models and tallying pass and fail rates across a sample of 10,000 users who have provided demographic information. For each model, it runs a suite of statistical tests, such as Bayes factors and Monte Carlo simulations, to determine whether any group has a significantly different pass rate, aiming for roughly 20 percent pass rates across the board and refining the model until no statistical difference is detectable across demographic groups.
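As a concrete illustration of one such check, the sketch below runs a two-group permutation test, a simple Monte Carlo simulation, on the pass-rate gap between two demographic groups. It captures the spirit of the audit described here; pymetrics’ actual test suite is not public.

```python
# A hedged sketch of one audit step: check whether pass rates differ across
# demographic groups with a permutation test (a simple Monte Carlo simulation).
import numpy as np

def permutation_test(passed_a, passed_b, n_sim=10_000, seed=0):
    """P-value for the observed pass-rate gap between groups A and B."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([passed_a, passed_b])
    n_a = len(passed_a)
    observed = abs(passed_a.mean() - passed_b.mean())
    hits = 0
    for _ in range(n_sim):
        rng.shuffle(pooled)  # break any true group/pass association
        gap = abs(pooled[:n_a].mean() - pooled[n_a:].mean())
        hits += gap >= observed
    return hits / n_sim

# Simulated 0/1 pass outcomes for two groups, targeting ~20% pass rates.
rng = np.random.default_rng(1)
group_a = rng.binomial(1, 0.20, size=5000)
group_b = rng.binomial(1, 0.22, size=5000)
p = permutation_test(group_a, group_b)
print(f"p = {p:.3f}")  # small p -> retune the trait combination and retest
```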

Laura Gómez, CEO and founder of Atipica, voices some concern about how AI is applied to hiring: “We’re at an inflection point around the power of how we’re building our algorithms to combat things that we think are not equitable or equal, but at the same time we have to consider where the line is being drawn.”

For now, Gómez is drawing the line at surfacing patterns and leaving the ultimate decision to a human. With Atipica, companies get a holistic picture of how their talent pipeline compares to national averages. The software aggregates and analyzes information from recruiting materials, resumes, applicant tracking software, sourcing CRMs, and so forth, then compiles those data points alongside government data such as labor statistics. Using artificial intelligence tools such as natural language processing to predict gender and race, Atipica can compare a company’s pipeline of candidates to national trends for similar applicants and positions. The hope is that by juxtaposing these different data sources, companies using Atipica can draw insights into their own potential hiring biases.
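A toy sketch of the benchmarking step might look like the following; the shares and group labels are invented placeholders, not Atipica’s data or taxonomy.

```python
# A minimal sketch of the benchmarking idea: compare a company's funnel
# composition to a national baseline. All numbers are invented placeholders.

funnel = {"women": 0.18, "men": 0.82}     # share of the company's applicants
national = {"women": 0.26, "men": 0.74}   # share in national labor data

def gaps(funnel: dict, baseline: dict) -> dict:
    """Percentage-point gap between the funnel and the baseline, per group."""
    return {g: round(funnel[g] - baseline[g], 3) for g in baseline}

print(gaps(funnel, national))  # {'women': -0.08, 'men': 0.08} -> skews male
```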

Pictured above is Atipica’s “Market Insight” dashboard, showcasing an industry comparison of a company’s funnel segmented by gender & diversity.

“I honestly think there is enough information out there to understand the gaps in tech aren’t just a ‘pipeline issue,’” said Gómez. “We wanted to tackle all the numbers so it wasn’t just [people] focusing on the silo of computer majors from top ten schools and then people blaming it on the pipeline.”

Garrett Lord, Ben Christensen, and Scott Ringwelski understand the bias Gómez is referring to. The three met at Michigan Tech, a school in Michigan’s Upper Peninsula that lacks the prestige of the state’s flagship, the University of Michigan.

According to a LinkedIn report, 70 percent of people hired last year knew at least one person at the company where they applied. For recent graduates entering the workforce, their school’s network can significantly affect the likelihood that they apply to a given job. Lord, Christensen, and Ringwelski co-founded Handshake to democratize job opportunities for all students — regardless of where they go to school, their major, or who they know.

Often, smart and talented students are disadvantaged solely because their university lacks prestige or proximity to a major city. Handshake is working to bridge this gap, connecting more than 500 universities with 250,000 employers to help expand recruiting pipelines across the country.

Handshake assists students in the career search by tailoring recommendations to their specific interests and preferences. The company builds recommendations from users’ views, favorites, and past applications to help students streamline the job search. Machine learning is at the core of this recommendation system: collaborative filtering lets the company observe students’ application patterns and tailor job opportunities to them specifically. “We are taking the popular product recommendation principles from experiences like Netflix and Spotify and applying it to help students discover different career opportunities,” data scientist Jack Bradmiller-Feld explains. While much of this is machine-driven, Handshake also does manual quality assurance to address misspellings and groupings of words that aren’t quite the right fit.
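A bare-bones version of item-based collaborative filtering on a student-job application matrix could look like this; it illustrates the principle only, and Handshake’s production system is certainly more elaborate.

```python
# A sketch of item-based collaborative filtering under simple assumptions:
# rows are students, columns are jobs, cells are 1 if the student applied.
# Jobs whose application patterns resemble a student's history are suggested.
import numpy as np

applications = np.array([
    # job0 job1 job2 job3
    [1, 1, 0, 0],   # student 0
    [1, 0, 1, 0],   # student 1
    [0, 1, 0, 1],   # student 2
])

def recommend(student: int, interactions: np.ndarray, k: int = 2) -> list[int]:
    """Rank unseen jobs by similarity to the jobs the student applied to."""
    norms = np.linalg.norm(interactions, axis=0) + 1e-9
    item_sim = (interactions.T @ interactions) / np.outer(norms, norms)
    scores = item_sim @ interactions[student]
    scores[interactions[student] == 1] = -np.inf   # drop already-applied jobs
    return list(np.argsort(scores)[::-1][:k])

print(recommend(0, applications))  # jobs most similar to student 0's history
```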

Handshake also builds machine learning features that extract meaning from the text of job postings using natural language processing techniques. One such feature is the recently introduced job-search keyword suggestions, which guide students who don’t know exactly what to search for. The company has observed that students who take advantage of these suggested keywords identify and apply to jobs on the platform at a rate five percent higher than peers who do not.
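One plausible way to derive such suggestions, sketched below, is to rank each posting’s terms by TF-IDF and surface the top-weighted ones as search hints; this is an assumption for illustration, as Handshake hasn’t published its actual approach.

```python
# A hedged sketch: extract high-TF-IDF terms from job postings as candidate
# search keywords. The postings here are made-up examples.
from sklearn.feature_extraction.text import TfidfVectorizer

postings = [
    "backend engineer building distributed systems in Go",
    "data analyst for marketing analytics and dashboards",
    "embedded firmware engineer with C and RTOS experience",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(postings)
terms = vectorizer.get_feature_names_out()

# Suggest the highest-weighted terms in each posting as search keywords.
for row in tfidf.toarray():
    top = [terms[i] for i in row.argsort()[::-1][:3]]
    print(top)
```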

While many companies focus on hiring to address diversity issues, there is plenty of potential to apply technology to other parts of the problem. A study by the Kapor Center for Social Impact found that toxic workplaces drive away women and people of color, often through unfair behavior or treatment. Artificial intelligence could be used to analyze data around retention, transfers, raises and promotions, responses to human resources complaints, and other workplace concerns. Still, while these applications offer ways to increase diversity in tech, it remains up to organizations to address the issue head-on.


