A grocery distribution company needed to hire warehouse workers where the job required lifting 50 lb. boxes, so they conducted a mechanical handle-pulling test on a pool of candidates. They gave the person who could lift the most weight beyond 50 pounds the highest score and ranked that person as “the most job-fit.”
The test itself — lifting against mechanical resistance to simulate box lifting — was a valid measure of upper-body strength for the lifting tasks. However, the scoring implementation, which screened in the candidates who lifted the most weight, was NOT valid, because the job only required that a worker be able to lift 50 lb. boxes.
Moreover, such a test discriminated against female applicants: they were less likely to lift as much as the male candidates in the pool, yet many were strong enough to meet the 50 lb. lifting requirement. When the grocery company was legally challenged, it claimed that the strongest candidates would also be the least likely to get injured. This seemed plausible; however, the company had no evidence that upper-body strength correlated with injury avoidance at its site, so the assessment process was deemed not valid.
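The scoring flaw above can be sketched in a few lines: a valid screen checks candidates against the actual job requirement on a pass/fail basis, while ranking by maximum weight lifted adds a criterion the job never required. A minimal illustration, using entirely hypothetical candidate data:

```python
# Hypothetical candidate results: max weight (lb) each lifted on the test.
candidates = {"Ana": 55, "Ben": 90, "Chloe": 62, "Dev": 48}

JOB_REQUIREMENT_LB = 50  # the job only requires lifting 50 lb. boxes

# Valid scoring: pass/fail against the actual job requirement.
qualified = {name for name, lb in candidates.items() if lb >= JOB_REQUIREMENT_LB}

# Invalid scoring: rank-ordering by raw strength screens out qualified
# candidates who don't top the list, adding a criterion the job doesn't require.
ranked = sorted(candidates, key=candidates.get, reverse=True)

print(sorted(qualified))  # ['Ana', 'Ben', 'Chloe'] -- all meet the requirement
print(ranked[0])          # 'Ben' -- but "strongest" is not the job standard
```

Under pass/fail scoring, three of the four hypothetical candidates qualify; under rank-order scoring, only the strongest is labeled "most job-fit," which is exactly the implementation error the grocery company made.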
Studies back this up, showing that bias in the recruitment process exists in terms of age, race, and gender. GetApp's research reflects this as well, with more women than men believing that most companies' recruitment processes are biased: 37% of women said they believe hiring processes are unfair, compared to 32% of men.
Unfortunately, such examples of discrimination are common in the U.S. labor market. Despite the well-documented benefits of a diverse workforce, U.S. companies, especially in the tech industry, are still falling short. This lack of diversity highlights a bias in the recruiting process. According to recent research from GetApp, more than a third of people think that most companies' hiring processes are biased and unfair, while almost a quarter of survey respondents said that all companies have unfair recruiting practices.
One technique some minority applicants use to get an interview is to scrub their résumés of language that might reveal their race and to alter a foreign-sounding name to something that sounds more American.
A research paper published in Administrative Science Quarterly describes a study in which 1,600 fabricated résumés were sent to 16 different areas in the U.S. Some names were left unchanged, whereas others were "whitened."
The results were stunning: 25.5% of résumés received callbacks when African-American candidates' names were "whitened," while only 10% received a callback when name and experience were left unaltered. For Asian applicants, 21% heard back if their résumés were "whitened," versus only 11.5% if they were not. The research also found that "job applicants with white names needed to send about 10 resumes to get one callback; those with African-American names needed to send around 15 résumés to get one callback." As for the tech industry, according to Gartner research, only 14% of CIOs are female, while only 2% of executives are Black and 3% are Latino.
Title VII of the Civil Rights Act of 1964 (Equal Employment Opportunities) prohibits employment discrimination based on race, color, religion, sex, and national origin. Therefore, for every job they post, employers are responsible for choosing or configuring assessments to be valid measures of candidates' skills.
However, bias exists on both a conscious and unconscious level, and can also arise due to the challenges and constraints of current recruiting processes. For example, if hiring managers or external recruiters are solely judged on the speed at which they can find suitable candidates, then they will often fall back on old qualification indicators, such as which school a potential employee attended.
HR and skill assessment software providers have made headway into tackling this issue, with blind auditions and skills-based tests taking the place of traditional hiring processes. Machine learning-based algorithms are enabling companies to make hiring decisions based on current demonstrable skills for the job, rather than specific qualifications such as a degree from an Ivy League school or a certain economic background.
When creating a selection process, you must also make sure it is valid and in compliance with the guidelines of the EEOC (U.S. Equal Employment Opportunity Commission). If an employer in good faith tries to follow EEOC recommendations and demonstrably customizes its selection assessments to fit the job requirements, the employer will have a robust defense against adverse-impact and other legal challenges. You can read more here about how conducting a complete job analysis can help you avoid biased hiring decisions.
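One widely cited screen for adverse impact, drawn from the EEOC's Uniform Guidelines on Employee Selection Procedures, is the "four-fifths rule": if any group's selection rate falls below 80% of the highest group's rate, the selection procedure may be flagged for adverse impact. A minimal sketch of that check, using hypothetical applicant and hire counts (this is a rule of thumb, not a full legal analysis):

```python
# Hypothetical applicant and hire counts per demographic group.
applicants = {"group_a": 100, "group_b": 80}
hired = {"group_a": 40, "group_b": 12}

# Selection rate = hires / applicants for each group.
rates = {g: hired[g] / applicants[g] for g in applicants}
highest = max(rates.values())

# Four-fifths rule: flag any group selected at under 80% of the top rate.
flagged = {g for g, r in rates.items() if r / highest < 0.8}

print(rates)    # {'group_a': 0.4, 'group_b': 0.15}
print(flagged)  # {'group_b'}: 0.15 / 0.4 = 0.375, well below 0.8
```

In this made-up example, group_b's selection rate is only 37.5% of group_a's, so the process would warrant a closer look at whether each assessment step is genuinely job-related.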
As HR and recruiting professionals, we have a responsibility to reduce and, where possible, eliminate bias from the hiring process by creating a level playing field for all candidates. Using an assessment that you have demonstrably tailored to fit a job's requirements, per a job analysis, is more compliant than using an off-the-shelf test or no test at all.
What are some of your strategies for decreasing the level of hiring discrimination in your company?