When it comes to AI in hiring, there’s a disconnect between how employers and job seekers view the technology.
Indeed’s recent global AI survey found that 92% of US human resources and talent acquisition leaders are already using AI systems and tools in some way. Yet the majority of Americans (61%) are completely unaware employers use AI for hiring, according to a 2023 Pew Research Center survey.
While most US employers are optimistic about AI’s impact on the workplace over the next one to five years (60%), job seekers are less so (54%). In fact, job seekers are more fearful of the technology’s potential impacts (25%) than employers are (16%). When asked what most concerns them, 47% of job seekers cited AI replacing human judgment and intuition in hiring decisions, and there have already been reports of problematic AI systems affecting applicants.
So, what can you do as an employer to reassure wary candidates that you haven’t outsourced your hiring to algorithms?
We turned to three experts: Indeed Vice President of Data Science Donal McMahon; Indeed Head of AI Innovation Hannah Calhoon; and Alan Walker, cofounder of Udder, a consulting business that helps talent leaders leverage AI and other technology.
Break down AI barriers
Generative AI is becoming more accessible. However, job seekers who are older, have lower incomes, or lack higher education are less likely to take advantage of such tools. This matters for hiring because candidates who use AI to generate application materials are more likely to pass automated screening, giving them an edge over those who don’t.
To help level the playing field for applicants who may not be comfortable or familiar with AI, start by clearly communicating how exactly they will encounter it in the hiring process.
“Transparency is always a great way to build trust,” Calhoon said. “Helping candidates understand where in the process their application will interact with AI tools is really useful.”
McMahon noted that transparency also gives you the opportunity to ask for feedback to improve, since there are bound to be missteps in incorporating any new systems like AI in hiring. “It gives us a chance to correct it the next time, and that creates a very powerful cycle so that the next [experience] and the next become better,” he said.
And, since these tools are created by humans, their screening is prone to the same systemic prejudices that influence human decision-making. In Indeed’s AI survey, 60% of job seekers expressed concern about bias in the data that trains employers’ AI hiring systems.
McMahon advised capturing and analyzing data at every stage of the hiring process to identify and avoid unintended bias. Indeed’s AI Principles, released by its Responsible AI team, offer a framework for keeping processes fair and equitable when incorporating new technology.
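One lightweight way to act on this advice is to compare pass rates across candidate groups at each hiring stage and flag large gaps for human review. The sketch below is purely illustrative — the group labels, numbers, and the 80% threshold (a common "four-fifths" rule of thumb) are assumptions for the example, not Indeed’s method:

```python
from collections import defaultdict

def pass_rates_by_group(outcomes):
    """outcomes: list of (group, passed) tuples for one hiring stage.
    Returns each group's pass rate."""
    totals, passed = defaultdict(int), defaultdict(int)
    for group, ok in outcomes:
        totals[group] += 1
        if ok:
            passed[group] += 1
    return {g: passed[g] / totals[g] for g in totals}

def flag_disparate_impact(rates, threshold=0.8):
    """Flag groups whose pass rate falls below `threshold` times the
    highest group's rate (the 'four-fifths' rule of thumb)."""
    best = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * best]

# Hypothetical screening-stage outcomes logged by candidate group
outcomes = ([("A", True)] * 40 + [("A", False)] * 60
            + [("B", True)] * 20 + [("B", False)] * 80)
rates = pass_rates_by_group(outcomes)
print(rates)                         # {'A': 0.4, 'B': 0.2}
print(flag_disparate_impact(rates))  # ['B'] — worth a human look
```

A flagged group isn’t proof of bias, but it tells you where to dig into the data — exactly the "capture and analyze at every stage" loop McMahon describes.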
“I encourage other companies to write down what matters to them and what’s going to guide their decisions — not only at a macro level, but in every tool they use and in every interview that they conduct,” McMahon said.
Walker cautioned employers to build up AI processes slowly to help avoid magnifying incidentally biased behavior. “It’s critical to be hyper-careful when testing,” he said. “You can never predict what will happen with scaling to more candidates, but if you test enough candidates enough times and then scale slightly, you’re in a position to stop it quicker without it doing as much damage.”
Luckily, problems in automated systems can be easy to correct. “Human mistakes may be for lots of different reasons, but something you’ve built is probably making the mistake for the same reason each time,” Walker said. “This means it may be easier to fix.”
Avoid the resume black hole
Many job seekers perceive AI tools as hyper-focused on keywords, ignoring a candidate’s full story. When applying to a job, they don’t know what algorithm or screening method an employer is using — or if it will automatically send their resume into the virtual abyss.
These fears aren’t unfounded: According to a Jobscan report, almost 99% of Fortune 500 companies filter candidates through applicant tracking systems (ATS), and these systems can be flawed. The technology compares resumes against job descriptions and ranks candidates on keyword matches, with thresholds as high as 70-80%, which can needlessly eliminate skilled candidates who don’t use the “right” terminology on their resumes.
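To see why literal keyword matching misfires, consider this minimal sketch — not any vendor’s actual algorithm; the scoring rule, stopword list, and sample text are all assumptions for illustration:

```python
import re

def keyword_match_score(resume, job_description):
    """Naive ATS-style score: fraction of job-description keywords
    that literally appear in the resume."""
    words = lambda text: set(re.findall(r"[a-z]+", text.lower()))
    stopwords = {"a", "an", "and", "the", "with", "for", "of", "in", "to"}
    keywords = words(job_description) - stopwords
    return len(keywords & words(resume)) / len(keywords)

job = "Manage projects and coordinate teams with agile methods"
resume_a = "Managed projects, coordinated teams, used agile methods daily"
resume_b = "Led cross-functional initiatives using scrum and kanban"

print(keyword_match_score(resume_a, job))  # ~0.67: even tense changes cost points
print(keyword_match_score(resume_b, job))  # 0.0: same skills, different vocabulary
```

The second candidate describes the same skill set in different words and scores zero — the “virtual abyss” in miniature. A hard 70-80% cutoff on a score like this rejects people over vocabulary, not ability.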
In fact, 88% of executives know their screening tools reject qualified candidates, according to a 2021 Harvard Business School survey. What’s more, nearly half are aware that their ATS will automatically reject candidates with a resume gap of six months or longer, even though it may stem from military deployment, caregiving duties, or medical conditions.
In a tight labor market, organizations can’t afford to miss out on qualified candidates over technicalities. Because excessive keywords in job descriptions can disproportionately shrink an ATS-filtered candidate pool, consider listing only the skills that are truly necessary.
Calhoon emphasized the importance of building quality assurance into your AI-enhanced processes to avoid accidentally screening out viable talent. This can be as simple as reviewing an AI-generated email before you hit send or taking a more in-depth look at a candidate’s profile after reading a promising summary.
“Everyone wants to know their job application is being fully reviewed and that they’re being seen in the best possible light,” Calhoon said. “Even if you are dealing with an enormous load of candidates, and it makes sense to leverage automation and AI to streamline workflows, make sure there are moments in your hiring when folks get to engage with humans, ask questions, and present themselves.”
Don’t sacrifice the humanity of hiring
A looming concern among job seekers is that using AI in hiring will replace the personal touch. Calhoon said it’s important to remember that hiring is fundamentally human and should remain that way.
“These are critical decisions that impact people’s lives,” Calhoon said. “We are trying to superpower TAs and those in human resources, but we’re not trying to replace their really smart, thoughtful judgment with an algorithm.”
When you post a job on Indeed, AI recommends candidates who match your requirements, letting you choose the best-fit candidates and invite them to apply with a personalized message. When AI streamlines human decision-making this way, rather than replacing it, candidates are 17 times more likely to apply for the job.
For many talent professionals, the challenge lies in striking the right balance between machine and human. Walker often finds the solution in experimentation.
He suggested testing various AI-powered approaches across your markets and surveying applicants about their experience to see what works. For example, try a high-touch human, low-touch tech approach in which communications such as initial outreach and interview follow-ups are personalized messages from a recruiter. Then test the reverse, where that communication is automated, and see what resonates with your candidate base.
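Evaluating such an experiment comes down to comparing application rates between the two arms. The sketch below uses a standard two-proportion z-test with entirely hypothetical numbers — the counts and the 100-candidate arm sizes are made up for illustration:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for comparing two conversion rates (pooled variance)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 30 of 100 candidates applied after personalized outreach,
# 22 of 100 after automated outreach
z = two_proportion_z(30, 100, 22, 100)
print(round(z, 2))  # ~1.29; |z| > 1.96 would suggest a real difference at ~95%
```

With these made-up numbers the gap isn’t statistically significant, which is itself useful: it tells you to keep testing before scaling one approach over the other, as Walker advises.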
“What we’ve discovered is that certain parts of the world are quite happy engaging with tech for most of the process,” Walker said. “Other parts of the world really push back once they realize a bot is involved. It is very dependent on the market.” He suggested embracing a more tech-heavy hiring approach while offering human support to applicants if they run into problems or have concerns at any step in the process.
AI is inevitably transforming recruitment and business as we know it. As you ramp up your use of new technology, make sure you don’t leave quality candidates behind.
This post was created by Indeed with Insider Studios.