Did you know that until recently, every NFL player had to take a 50-question pre-employment test called the Wonderlic in order to play? A quarterback took the same cognitive test to land a job that a bus driver or a corporate job seeker still takes to land one.

For many years, Wonderlic results were kept secret, but whenever the data has been made public, the scores have revealed a clear pattern of racial bias.

Just how bad is it? The average Wonderlic score for the general population is 21. In one study, black NFL draft picks averaged 19.8, compared to 27.7 for white draft picks. Every player who scored below 18 was black, while every player who scored above 30 was white.

According to NFL executive vice president of football operations Troy Vincent, the results of a “combined audit of all assessments” led the league to stop administering the test in 2022 because, “frankly, it’s been an outdated process.”

The legal term for discrimination caused by biased hiring tests is disparate impact. It occurs when an employment practice appears neutral but in fact disproportionately harms certain groups, such as people of color and women. Employment testing of one form or another has been used to maintain segregation in workplace, military, and college settings for nearly 100 years.

As a local example, in 2012, New York City was sued for $128 million over the fire department’s “neutral” hiring practices. Between 1999 and 2006, the FDNY screened entry-level firefighters with a written exam that disadvantaged black and Hispanic candidates. After an investigation, it became clear that the test had nothing to do with a person’s firefighting abilities, meaning the department had potentially denied jobs to thousands of qualified minority candidates. It wasn’t the first time the FDNY had turned a blind eye to racial bias in its hiring tools. The court called the city’s conduct “part of a pattern, practice and policy of intentional discrimination against black applicants.”

NYC Local Law 144, passed in December 2021, contains two key requirements aimed at the heart of disparate impact. First, employers must conduct bias audits of any employment tests or tools they use. Second, employers must publicly disclose the audit results. For the first time, employers will have to disclose the disparate effects of any tests they use to screen applicants, whether they create them internally or purchase them from vendors.

For decades, my research career has been dedicated to reducing the discrimination caused by biased employment tests and to helping people understand how bias arises in testing. My enthusiasm for this law stems from the fact that it will bring transparency to a much wider range of automated tools, including tests like the Wonderlic.

Before the law takes effect in January, the Adams administration must issue official rules clarifying its implementation. This should be straightforward, especially since disparate impact is an established legal concept.

Unfortunately, the city’s business community wants to carve out loopholes to avoid transparency. If it can narrow the definition of “automated employment decision tools” so that the law applies only to sophisticated computer applications, employers can continue using some of the most biased paper-and-pencil cognitive tests. That would be like requiring electric cars to report carbon emissions, but not gas guzzlers. All automated tools need to be audited, no matter how high-tech or low-tech they are.

However worried some employers may be, bias audits are nothing new to them. Since the civil rights era, federal regulations have required many organizations to collect the very data that the new NYC law highlights. But many of these organizations are deeply attached to their current employment practices and do not want to face pressure to abandon them.

In a classic case of blaming the victim, some proponents of traditional testing even ask why test takers of color don’t complain about disparate effects.

The answer is obvious: individual test takers do not have access to the relevant information. They don’t know how many applicants from different ethnic groups applied for a given job, and there is currently no public information on the extent of bias in employment practices. You just don’t know what you don’t know.

Some automated job evaluations may be entrenching racial hierarchies in the workplace. The only way we’ll know for sure is to put the numbers in the public eye. The new law simply requires employers to make reports they already produce available to the public. It’s that simple.

New York City is close to opening the door to a fair and transparent process for selecting pools of qualified applicants. The business community, architect of the city’s labor ecosystem, owes it to New Yorkers to build a diverse and competent workforce that reflects the customers, clients and audiences it serves. We should walk through that door, not give in to loopholes.

Helms is the Augustus Long Professor Emeritus in the Department of Counseling, Developmental and Educational Psychology and director of the Institute for the Study and Promotion of Race and Culture at Boston College.

