As an employer, you should be screening employees based on qualifications and preferences, not a candidate’s gender. This seems obvious, but even the most well-meaning employers and recruiters are subject to the curse of implicit bias.
Implicit bias comes into play when unconscious attitudes or stereotypes about someone's gender, sex, race, ethnicity, age, religion or other identifying features are used to judge that individual's competency. This differs from explicit bias, where a person is aware of the stereotypes they hold, even if they choose not to disclose them.
Researchers from major universities including Harvard teamed up to create Project Implicit, a collection of implicit-association tests (IATs) that detect implicit bias through rapid word associations. Their popular Gender-Career IAT "often reveals a relative link between family and females and between career and males."
The test has users pair pre-established names of men and women with family- and career-related words. In one round, test takers are prompted to quickly match typically masculine names with words associated with family; in the next, they may pair feminine names with career words.
Based on hesitation and accuracy, users get an interpretation of their potential implicit biases. This comes into play with employee screening, where something as simple as seeing a name on a resume can influence an employer, even in the absence of known biases.
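The real IAT uses a more involved scoring algorithm, but the core idea, comparing how quickly people complete stereotype-consistent versus stereotype-inconsistent pairings, can be sketched in a few lines. The response times below are hypothetical, purely for illustration:

```python
import statistics

def iat_gap(congruent_ms, incongruent_ms):
    """Return the difference in mean response time (ms) between
    'incongruent' rounds (pairings that cut against the stereotype)
    and 'congruent' rounds (pairings that match it). A larger
    positive gap hints at a stronger implicit association."""
    return statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)

# Hypothetical per-trial response times, in milliseconds.
congruent = [620, 580, 650, 600]      # e.g., feminine names + family words
incongruent = [810, 790, 760, 840]    # e.g., feminine names + career words

gap = iat_gap(congruent, incongruent)
print(f"Mean slowdown on counter-stereotypical pairings: {gap:.0f} ms")
```

A test taker who hesitates noticeably longer on the counter-stereotypical rounds would see that reflected in their interpretation.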
In a Yale study, social psychologist Corinne Moss-Racusin (now at Skidmore College) created two identical, fictitious resumes for a lab manager position. The resumes differed only in the name: one fake applicant was named Jennifer, the other John.
Different versions were sent out to STEM professors across the country for evaluation. Overall, the "Jennifer" resume received less interest and was recommended a salary averaging nearly $4,000 less than the identical "John" resume.
Implicit gender bias was present even among the women scientists who reviewed the resumes. Women are underrepresented across STEM fields, and in tech especially, men are disproportionately hired over women.
So what can be done to level the playing field for gender when even a name could make employers think women candidates are less qualified?
Stop looking at names when initially researching a candidate. Okay, I know this is easier said than done, and it isn't feasible if you're screening through the normal process of resume submission and in-person hiring events.
But if you use an online source, more platforms are offering solutions for fairer hiring practices that allow you to blind screen employees during initial rounds.
For example, job search site Woo offers anonymity for prospective employees, only revealing a candidate’s name and profile with their permission. During the initial pairing process, skills and background are shared, but other details are not available.
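Woo's actual screening logic isn't public, but the basic blind-screening idea is simple: strip identifying fields from a candidate's profile before an employer ever sees it, leaving only skills and background. A minimal sketch (field names here are hypothetical):

```python
# Illustrative only — the field names and profile shape are assumptions,
# not Woo's real data model.
IDENTIFYING_FIELDS = {"name", "email", "photo_url", "gender"}

def blind_profile(profile: dict) -> dict:
    """Return a copy of the profile with identifying fields removed."""
    return {k: v for k, v in profile.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jennifer Doe",
    "email": "jdoe@example.com",
    "gender": "female",
    "skills": ["Python", "SQL", "data analysis"],
    "years_experience": 5,
}

# Only skills and experience survive the redaction.
print(blind_profile(candidate))
```

The employer evaluates the redacted profile; the full version is unlocked only when the candidate opts in.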
When setting up a talent profile, potential employees fill out a wish list, telling Woo about ideal opportunities, like higher salary, company culture, or desire to work with new technology. Likewise, employers set up their profile to reflect what their different positions can offer.
Using an AI algorithm, Woo matches employer and candidate preferences to make relevant offers. During this step, users' identities stay hidden until they find an opportunity that matches their preferences and actively choose to share their expanded profile with that company.
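Woo's matching algorithm is proprietary, but the wish-list idea can be sketched as a simple overlap score between what a candidate wants and what a position offers (the scoring rule below is my assumption, not Woo's):

```python
def match_score(wish_list: set, position_offers: set) -> float:
    """Fraction of the candidate's wishes that the position satisfies.
    Returns 0.0 for an empty wish list to avoid dividing by zero."""
    if not wish_list:
        return 0.0
    return len(wish_list & position_offers) / len(wish_list)

wishes = {"higher salary", "new technology", "remote work"}
offers = {"higher salary", "new technology", "mentorship"}

score = match_score(wishes, offers)
print(f"Match score: {score:.2f}")  # 2 of 3 wishes met
```

A real matcher would weight preferences and add skill requirements, but the key point stands: nothing in this computation needs the candidate's name or gender.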
Woo even adjusts education and work history “so that it’s completely generic and less personal” to provide further identity cloaking. (Bonus: if you’re job hunting on the DL, Woo won’t pair you with current or past employers.)
This means employers can’t apply implicit or explicit bias based on name or profile information that may reveal personal details like gender or race.
Once a user chooses to share this information, employers are free to Google and social media hunt the prospective employee to their heart’s content.
Until then, talent benefits from being seen solely for their skills and experience. This can help level the playing field, especially in the tech industry, which is notoriously skewed towards hiring men.
Major companies like Lyft, Wix, and Microsoft are already using Woo, and the service is available to job seekers in the United States and Israel.
Other job sites should consider scrubbing personal details like name and gender from the initial searches and matches shown to employers. This can help eliminate bias based on gender and other personal factors.
If you're seeking a job, you can use Woo for free. Employers can submit their info to be contacted by Woo about signing up and starting a better, bias-free recruitment process.