The EU AI Act Is Already in Your Hiring Pipeline

Here's a question worth sitting with for a moment: is your technical recruitment team using any AI tools right now? An applicant tracking system that filters CVs? A platform that scores candidates? A note-taking bot that summarises interviews? If the answer to any of those is yes, and you have candidates or employees in the EU, you are already operating in territory the EU AI Act cares about.

I say this not to alarm, but because there is a persistent assumption out there that the AI Act is a concern for companies building AI products, not for companies simply using them. That assumption is worth revisiting before it becomes expensive.

What the Act Actually Says About Hiring

The EU AI Act explicitly lists AI systems used for the recruitment or selection of individuals, including those that analyse and filter job applications and evaluate candidates, as high-risk AI systems. That is not a grey area. An AI system that ranks CVs, filters applications against criteria, or scores candidates on any dimension is captured under Annex III of the Act. Your ATS with a smart filter? Captured. AI-assisted interview notes that feed into a hiring decision? Likely captured. A coding assessment platform with AI scoring components? Worth a very close look.

Using AI for emotion recognition in candidate interviews or video assessments is outright forbidden under the Act, as of February 2025. If you have been using tools that claim to read candidate sentiment or personality from video, those need to stop.

The Deadline You Should Be Working Toward

The core requirements for high-risk AI systems, including documentation, human oversight, and audits, become enforceable on August 2, 2026. That sounds like a comfortable distance away, but the due diligence required to get there is not trivial, and starting now is the sensible approach.

The extraterritorial reach of the Act means that a company can have EU AI Act obligations without any physical EU presence, if AI outputs are used to recruit EU candidates or evaluate EU-based workers. If your global hiring platform touches EU candidates, you are in scope.

What Due Diligence Looks Like Right Now

For companies deploying AI in recruitment, the obligations fall into a few key areas. Employers must inform candidates about the use of high-risk AI in recruitment, including how the AI functions and how decisions are made. Individuals have the right to request explanations about the role of AI in decisions affecting them.

Beyond transparency, human oversight must be established by individuals with appropriate competence and authority, and with the genuine ability to intervene and override outputs where necessary. AI-generated logs must be retained for a period appropriate to the system's purpose, and for at least six months as a baseline.

Employers must also continuously monitor the operation of high-risk AI systems, identify any risks that materialise during use, and ensure that staff using these systems have adequate AI literacy. That last point is already in force. AI literacy obligations for providers and deployers became applicable from February 2025.

A practical starting point is an AI inventory for your hiring process: what tools are in use, what decisions they touch, and whether those tools fall under the high-risk category. From there, the work involves reviewing vendor documentation, checking whether your contracts with AI tool providers address compliance obligations, and putting transparency notices in place for candidates.
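To make the inventory step concrete, here is a minimal sketch of what such an inventory record might look like in code. Everything here is illustrative: the field names, the use-case labels, and the simple classification heuristic are my own assumptions for structuring the exercise, not an official taxonomy from the Act.

```python
from dataclasses import dataclass, field

# Illustrative labels for uses that Annex III treats as high-risk in
# recruitment, plus the prohibited emotion-recognition use case.
# These category names are assumptions chosen for this sketch.
ANNEX_III_HIRING_USES = {
    "cv_filtering",
    "candidate_ranking",
    "candidate_scoring",
    "interview_evaluation",
}
PROHIBITED_USES = {"emotion_recognition"}


@dataclass
class HiringAITool:
    name: str
    vendor: str
    uses: set                       # what the tool does in the pipeline
    feeds_hiring_decision: bool     # do its outputs influence decisions?
    candidate_transparency_notice: bool = False
    log_retention_months: int = 0

    def classification(self) -> str:
        """Rough triage, not legal advice: prohibited, high-risk, or review."""
        if self.uses & PROHIBITED_USES:
            return "prohibited"
        if self.feeds_hiring_decision and self.uses & ANNEX_III_HIRING_USES:
            return "high-risk"
        return "review"  # needs a closer human look

    def gaps(self) -> list:
        """Flag obvious compliance gaps for high-risk tools."""
        issues = []
        if self.classification() == "high-risk":
            if not self.candidate_transparency_notice:
                issues.append("missing candidate transparency notice")
            if self.log_retention_months < 6:
                issues.append("log retention below six-month baseline")
        return issues


# Hypothetical example entry, not a real product.
ats = HiringAITool(
    name="SmartFilter ATS",
    vendor="ExampleVendor",
    uses={"cv_filtering", "candidate_ranking"},
    feeds_hiring_decision=True,
    log_retention_months=3,
)
print(ats.classification())  # high-risk
print(ats.gaps())            # flags notice and retention gaps
```

Even a spreadsheet does the same job; the point is that each tool gets a row, a classification, and a list of gaps to close before August 2026.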

This is exactly the kind of work that benefits from having someone who understands both the regulatory intent and how it translates into your specific technical stack. If you would like to work through what this looks like for your recruitment process and tools, feel free to get in touch. I am also available to walk your HR and technical teams through the Act's implications in a focused session, which tends to be a good way to get everyone on the same page quickly.
