Artificial intelligence, hiring and the law

Hire the AI software. Fire the regulations designed to limit it.
Geoff Williams
NRF Contributor

Alexa and Siri notwithstanding, the concept of artificial intelligence can creep anyone out. It might not be a surprise, then, that some city governments are not only unnerved by it but are also regulating it.

Some government officials are understandably worried about artificial intelligence programs taking away jobs — but lately, some municipalities appear to be concerned that AI is being used to help people get jobs.

For instance, New York City and the District of Columbia are among the locales enacting or considering laws that restrict how employers use artificial intelligence programs in hiring and promotion decisions.

If you’re unaware of what is transpiring in the world of human resources, AI and city governments, here’s what is at stake.

How AI can help with hiring

Increasingly, recruiters and human resources departments have been using AI tools to help find job candidates by performing repetitive and time-consuming tasks like analyzing resumes, arranging interviews and scheduling job assessments.

These responsibilities are necessary, but they amount to little more than busywork. Employers can use software programs to complete these tasks, freeing up staff to do work that takes deeper thinking.
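
To make that concrete, here is a minimal, hypothetical sketch of the kind of rules-based screening such tools automate. It is not any vendor's actual product; the required credential, keywords and sample resume are invented for illustration.

# Illustrative sketch only: a toy version of the kind of resume screening
# an AI hiring tool might automate. Credentials and keywords are hypothetical.
REQUIRED_CREDENTIALS = {"J.D."}  # e.g., hiring a lawyer
PREFERRED_KEYWORDS = {"litigation", "contracts", "compliance"}

def screen_resume(resume_text: str) -> dict:
    """Check whether a resume mentions the required and preferred qualifications."""
    text = resume_text.lower()
    meets_requirements = all(c.lower() in text for c in REQUIRED_CREDENTIALS)
    preferred_matches = sorted(k for k in PREFERRED_KEYWORDS if k in text)
    return {"meets_requirements": meets_requirements,
            "preferred_matches": preferred_matches}

sample = "Jane Doe, J.D., five years of litigation and contracts experience."
print(screen_resume(sample))
# {'meets_requirements': True, 'preferred_matches': ['contracts', 'litigation']}

Commercial tools are far more sophisticated than this, but the appeal to employers is the same: the repetitive first pass over a stack of applications happens automatically.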

What city governments fear

It’s simplistic and a little unfair to just say that some government officials are concerned that AI is being used to help people get jobs. The worry that city officials have is that AI programs might be used, intentionally or not, to discriminate against some job candidates.

In New York City, starting January 2, 2023, employers will be required to conduct an independent audit of the automated hiring tools they use to ensure those AI programs aren’t discriminating against job candidates.

Late last year, Washington, D.C.’s attorney general announced proposed legislation that would look at “algorithmic discrimination” and oblige companies to submit annual audits about their technology.
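
What might such an audit actually measure? A common starting point is to compare selection rates across demographic groups. The sketch below is purely illustrative: the group labels and counts are made up, and it is not a statement of what either law requires auditors to compute.

# Illustrative sketch of one metric a bias audit might report: each group's
# selection rate relative to the highest group's rate. All figures are hypothetical.
applicants = {  # group -> (number selected, number who applied)
    "group_a": (30, 100),
    "group_b": (18, 90),
    "group_c": (12, 80),
}

selection_rates = {group: selected / applied
                   for group, (selected, applied) in applicants.items()}
highest_rate = max(selection_rates.values())

for group, rate in sorted(selection_rates.items()):
    print(f"{group}: selection rate {rate:.2f}, impact ratio {rate / highest_rate:.2f}")

A ratio well below 1.0 for any group is the kind of disparity these audit requirements are designed to surface.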

What retailers fear

Making sure that AI doesn’t discriminate against job candidates is obviously a worthy and admirable goal; no decent business owner wants to discriminate against a deserving potential employee. But while the mission of eliminating bias is noble, the laws that have so far been put forward by cities like New York and Washington, D.C., are confusing, murky and potentially expensive to comply with.

“Our members use AI tools judiciously as part of their hiring process to screen, assess and select job candidates. No one wants to discriminate against an applicant. Nor do employers use these tools in an attempt to shield themselves from allegations of discrimination. NRF members use AI tools to separate applicants who aren’t qualified from those who are qualified. If we are hiring a lawyer, we want them to have a J.D. If we’re hiring a web designer, you want them to have certain experiences that make them qualified to do web design,” says Edwin Egee, vice president of government relations and workforce development at the National Retail Federation.

Many retailers and their advisors, such as employment attorneys, are concerned that regulating AI hiring tools will hurt businesses and their employees rather than help them. In particular, they worry that city rules will make the cost of hiring people even more onerous.

That’s the last thing any business needs in a time of inflation and labor shortages. If you’re using AI to make your search for candidates more efficient and cost-effective, but you also have to hire a third party to conduct regular bias audits of those tools, suddenly the tools are neither efficient nor cost-effective.

If such policies are enacted, the beneficiaries will not be job applicants but rather the third-party companies hired to conduct the audits.

After all, while it may be possible for AI programs to discriminate against job candidates, it seems a leap to suggest that humans performing the same assessment and recruiting tasks would do any better.

It isn’t as if humans have never been accused of discrimination when hiring. In fact, there’s a solid argument that AI, because it strips the emotional component and inherent human biases out of the process, will be far better than human beings at choosing job candidates fairly.

“My viewpoint is that, happily, employment discrimination is increasingly rare,” Egee says. “To the contrary, the vast majority of employers are eager to diversify their workforce. Moreover, we’re in the midst of an unprecedented workforce shortage during which employers are desperate to find employees. As such, most business owners couldn’t be less interested in finding new methods to disqualify qualified candidates from a job.”
