Friend or Foe: AI in the Recruitment Process

As recruitment becomes increasingly complex and competitive, employers need new ways of differentiating candidates, and candidates need better ways of demonstrating their abilities. Artificial intelligence (AI) is being introduced into recruitment to save time, narrow down applicant pools and, it is hoped, eliminate unconscious bias from hiring decisions. So, let’s take a deeper look at AI in the recruitment process.

AI recruiting tools are expected to become mainstream by 2030. Recently, however, AI has developed a justifiably poor reputation: concerns are rising that it reproduces cultural biases rather than eliminating them.

So where does this leave HR? We would like to assume that AI would enable a company to hire a person solely on the skills the job requires, making hiring a merit-based system. The algorithms used to assess candidates are based on quantifiable data and can be purposefully built to discard prejudices. However, it’s important to consider who is designing these products and whom they will benefit.

The irony is that while the tech industry has large gender disparity problems, technology also seems to be a way to combat unconscious bias, close the gender gap and equalise opportunities for minorities. The use of these systems can only increase over time, so it’s important to explore both the potential positive and negative outcomes of introducing AI recruiting tools.

AI in the recruitment process for candidates

People undoubtedly connect with those who look or sound like them, share their interests and come from similar backgrounds. It’s no surprise, therefore, that employers tend to hire along the same lines. AI could offer an easy way of avoiding such predispositions. New technologies can identify the right candidates based on the characteristics the job itself requires. Factors that unconsciously influence an employer’s decision-making, such as gender, race and class, could be ignored entirely. Instead, candidates could be judged first and foremost on their skills and their ability to problem-solve or handle stress.
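In practice, one simple (and admittedly partial) version of this idea is to redact protected attributes from a candidate record before any scoring happens. The following is a minimal sketch, with hypothetical field names and a naive skills-overlap score, not a description of any real recruiting product:

```python
# Hypothetical "blind screening" step: strip fields that could reveal
# gender, race or class, then score the candidate purely on how many of
# the job's required skills they list. Field names and the scoring rule
# are illustrative assumptions, not taken from a real system.

PROTECTED_FIELDS = {"name", "gender", "age", "photo", "address"}

def blind(candidate: dict) -> dict:
    """Return a copy of the candidate record without protected fields."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

def skills_score(candidate: dict, required_skills: set) -> float:
    """Fraction of the job's required skills that the candidate lists."""
    have = set(candidate.get("skills", []))
    return len(have & required_skills) / len(required_skills)

candidate = {
    "name": "A. Example",
    "gender": "F",
    "skills": ["python", "sql", "communication"],
}
screened = blind(candidate)
score = skills_score(screened, {"python", "sql", "statistics"})
```

Note that redacting explicit fields is not the whole story: attributes such as school, postcode or word choice can still act as proxies for the very characteristics removed, which is exactly the design risk discussed next.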

However, we must also think about how these systems can be seriously flawed in their design. To think that we can create something totally objective is simply naïve; worse, ignoring the risk of creating products that favour some candidates over others based on gender, ethnicity or accent is far more serious. PwC’s recent study found that women hold only 5% of senior positions in the tech industry – a disturbing figure, considering that products tend to benefit those who create them.

As entrepreneur, author and keynote speaker Margaret Heffernan said, ‘the tools we create reflect those that created them’. Smart screening checks may in fact disadvantage people from particular socioeconomic backgrounds. Amazon’s attempt at building a recruiting engine, begun in 2014, showed exactly this. We need to take this conversation to the developers and ask them to show that they take this responsibility seriously, doing everything they can to build the diversity and inclusion agenda into the design.


AI in the recruitment process for the company

AI in hiring is an inexpensive way to source candidates for a position. It may not pick the employee a company ultimately hires, but it reduces the number of CVs recruiters must review.

Recruiting engines could also be used to diversify a workforce. Creating a diverse workforce is now essential to a company’s productivity and success. Importantly, a diverse board of directors or management team generates more varied and original thinking, which leads to greater creativity and innovation. A wide range of perspectives allows issues to be approached from multiple directions and better solutions to be reached.


Reflections for the future

It seems that the only way to equalise the system is if the engineers confront their own biases and cultural privileges. We must consider who creates the algorithms that will judge which candidate has a higher value than another. While AI may be a way to avoid bias, it also could sustain biases and reproduce structural barriers for various groups of people.

Companies are increasingly choosing to hire external keynote speakers to guide them and share opinions on the future of recruitment. HR is a traditionally human domain, positioned around the cultural elements that make up a company: workers’ rights, advocacy and building relationships with employees. Sociological and technological experts should be sharing their views on the transition from a human sector to a technological one, and its impact on society, the workplace and the workforce.

About the Author

Flora Meadow

Flora is a freelance writer, holds an MA in Media and Communications from LSE and works in the entertainment industry in London. Flora’s interests include digital anthropology, photojournalism, the politics of documentary filmmaking, and the mediation of memory and ideology.