
How to Finally Add AI to Your Hiring Tech Stack (Even if You’re Hesitant)

While AI has brought impactful changes across sourcing, hiring, and employee career development, many HR departments are still holding out on adoption. Concerns about budget amid economic uncertainty and the risk of falling out of compliance with emerging AI laws are valid, but with some preparation your organization can limit liability and increase the ROI on your investment.

To ease your doubts about AI, we compiled five must-do tasks for HR departments exploring AI hiring tools. You’ll learn the right questions to ask vendors, help your organization stay on the right side of federal and state AI laws, and build alignment on how to get the most out of AI so you can reach business objectives faster and gain an edge over your competition.

A perfect storm awaits: Why you need AI now 

In most cases, organizations haven’t adopted AI talent tools because they don’t yet fully understand the technology. We empathize: AI is dynamic and imperfect, and no one understands it completely. But organizations cannot afford to sit on the sidelines and let the unknown hold them back, because changes in the talent landscape are happening now, and they’re happening fast.

According to Forrester, organizations should be preparing for:  

A resource crisis  

  • There will soon be 3 million more job openings than available workers.

  • Recruiting teams have been hit the hardest by recent layoffs, with reductions of roughly 50% compared to about 12% in other departments.

  • Employee engagement is the lowest it's been in a decade, putting your workforce at high risk of leaving for outside opportunities.

A technology arms race between candidates and recruiters 

  • Candidates are increasingly using AI to apply to hundreds of positions at a time. Talent teams will need to keep up with this influx as traditional screening methods become overwhelmed. 

  • Companies using AI recruiting capabilities are gaining a strong competitive advantage over those that aren’t.

If you’re not prepared, your competitors absolutely will be, and they’ll be better equipped to handle the slew of talent challenges that lie ahead for organizations everywhere.

Adding AI to your recruiting tech stack: What to do and what to ask 

All it takes to remove the fear of the unknown is to educate yourself. Here’s what we recommend doing to get your AI adoption plan off the ground confidently.  

1. Align AI use cases to organizational goals  

To set teams up for success with AI, HR leaders should determine potential use cases that align with business goals. For example, “How can AI create and manage a skills taxonomy for our team?” Get specific about how AI will change your recruiting and talent management process with the following: 

  • Map AI capabilities to your most pressing talent challenges  

  • Quantify AI’s potential impact on business objectives (e.g., reduce time to hire by 20%)  

  • Prioritize use cases that deliver strong ROI with minimal compliance risk

According to Forrester Analyst Betsy Summers, there are many low-effort, high-reward opportunities to capture with AI, including career pathing and performance feedback. The report also reveals that AI-ready HR teams that take this approach will accelerate their organization's overall AI success compared to competitors.

2. Train users to identify bias in AI and have a plan to maintain best practices

Identifying bias starts with us. AI can help curb bias when the system is trained with unbiased data, but it’s not foolproof: machines can make mistakes. Provide your recruiters with regular bias training and offer resources to help them reinforce these practices day to day:

  • Develop bias detection protocols that exceed regulatory requirements

  • Create accountability structures for AI-assisted decisions

  • Implement continuous monitoring systems to catch potential issues early

  • Understand how any AI tool you use explains its decisions to users

To elaborate on the last bullet point: AI shouldn’t tell you what decisions to make; it should recommend. A system should be able to show you how it arrived at a recommendation. That transparency supports legal compliance, helps users detect bias, and empowers informed decision-making. Explainable AI decisions also create a documented trail that can protect companies if their hiring decisions are challenged, while helping organizations refine their hiring practices over time. This is not just a best practice; it’s increasingly becoming a legal requirement as AI regulations evolve.
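To make the idea of a documented trail concrete, here is a minimal sketch in Python of how a team might log each AI recommendation alongside the explanation the tool surfaced and the human’s final call. The record fields, file format, and sample values are illustrative assumptions, not any vendor’s actual API.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json


@dataclass
class AIRecommendationRecord:
    """One auditable entry: what the AI suggested, why, and what a human decided."""
    candidate_id: str
    requisition_id: str
    recommendation: str        # the AI's suggestion, e.g., "advance to phone screen"
    top_factors: list[str]     # human-readable reasons surfaced by the tool
    human_decision: str        # what the recruiter actually decided
    decided_by: str            # who made the final call
    decided_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def log_decision(record: AIRecommendationRecord, path: str = "decision_trail.jsonl") -> None:
    """Append the record to a JSON Lines file so every AI-assisted decision
    leaves a timestamped, reviewable entry."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")


# Example usage with made-up identifiers
log_decision(AIRecommendationRecord(
    candidate_id="cand-123",
    requisition_id="req-456",
    recommendation="advance to phone screen",
    top_factors=["5+ years of relevant experience", "skills match on core requirements"],
    human_decision="advanced",
    decided_by="recruiter@example.com",
))
```

Storing the recommendation and the human decision side by side makes it easy to review later whether recommendations were followed blindly or critically evaluated.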

3. Ask vendors how they stay compliant with emerging laws and regulations

Your AI vendors should be compliance partners, not risk factors. Lawmakers are hyper-focused on how AI is being used in employment, and AI laws and regulations will continue to move quickly at both the state and federal levels. Learn how every vendor you want to work with stays compliant with these changes.

A few sample questions you could ask:

  • How does your team monitor and implement updates based on state laws, such as Illinois' AI Video Interview Act and California's automated decision systems regulations?

  • What's your typical timeline from when a law is passed to ensuring compliance?

Confirming your vendors are compliant at the start of your relationship keeps legal issues from surfacing and prevents the costly process of switching systems down the road.

4. Create messaging and strategy around notifying candidates when AI is being used

Many hiring laws and regulations require organizations to let candidates know when AI is involved in the hiring process, such as New York City’s Local Law 144 and Illinois’ AI Video Interview Act. While these examples are localized, you’ll have to consider them for any candidates you interview who live in these areas.

Your organization can go beyond these laws and deliver a first-class candidate experience that sets you apart from competitors with the following:

  • Develop candidate-friendly AI disclosure templates

  • Create educational materials that explain how AI improves the candidate experience

  • Implement feedback loops to improve transparency

According to Pew Research Center, 71% of Americans oppose employers using AI to make final hiring decisions. Use this as an opportunity to build trust with candidates by letting them know that AI won’t be used this way in your hiring process.

5. Ask vendors about their responsible AI program 

Vendors should have a governance program for how they build and use AI systems. A responsible AI pledge helps demonstrate that a vendor has thought deeply about how the technology they build impacts users and candidates while considering emerging laws and regulations. As an example, here’s SeekOut’s AI pledge, which includes seven core principles we follow to ensure the responsible use of AI in our products and services.

Vendors should be able to clearly answer questions that exemplify their commitment to keeping your organization safe from liability. A few sample questions you could ask:   

  • What data is being used to train the system? Ask about the technology behind the tool and how the vendor gathers its data.

  • How do you monitor and audit your data and algorithms? The answer reveals how the system prevents bias and inaccurate outcomes.

  • How is the system audited? External assessment is important to ensure there is no unintentional, harmful bias. Consider asking for any audit reports or materials related to bias assessment.

These questions map directly to AI laws like New York City’s Local Law 144 mentioned earlier, which prohibits employers from using an automated employment decision tool unless it has undergone a bias audit within the past year.
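For a sense of what an auditor’s numbers look like, below is a simplified Python sketch of the impact-ratio arithmetic bias audits typically report: each group’s selection rate divided by the highest group’s selection rate. The counts are invented for illustration; an actual Local Law 144 audit must follow the law’s prescribed methodology and be conducted by an independent auditor.

```python
# Illustrative counts only: (selected, total assessed) per demographic category.
outcomes = {
    "group_a": (48, 120),
    "group_b": (30, 100),
    "group_c": (9, 40),
}

# Selection rate = share of assessed candidates in the group who were selected.
selection_rates = {group: selected / total for group, (selected, total) in outcomes.items()}
highest_rate = max(selection_rates.values())

# Impact ratio = a group's selection rate relative to the most-selected group's rate.
for group, rate in sorted(selection_rates.items()):
    print(f"{group}: selection rate {rate:.2f}, impact ratio {rate / highest_rate:.2f}")
```

A ratio well below 1.0 for any group is a signal to investigate further, which is exactly the kind of finding your vendor should be able to explain and act on.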

Overview of AI Regulations in HR and Recruitment 

This list is not intended to serve as legal advice, but it provides a quick reference to two AI employment laws and regulations that impact organizations. Please note that because these laws and regulations change quickly, recent updates may not be reflected here.

New York City Local Law 144 (Automated Employment Decision Tools Law) 

Effective date: July 5, 2023 

Key requirements: 

  • Employers must conduct bias audits of AI tools used in hiring/promotion 

  • Results of bias audits must be published publicly 

  • Candidates must be notified when AI tools are used in assessments 

  • Candidates can request alternative evaluation methods 

  • Companies must disclose what job qualifications/characteristics are assessed 

Read more about Local Law 144

Illinois Artificial Intelligence Video Interview Act 

Effective date: January 1, 2020 

Key requirements:   

  • Employers must notify candidates if AI analyzes video interviews 

  • Must explain how the AI system works and what characteristics it evaluates 

  • Must obtain consent before using AI analysis 

  • Must limit who can access the video and delete it if requested 

Read more about the Artificial Intelligence Video Interview Act

