Veltrix
April 1, 2026

AI in HR and Hiring: How AI Is Changing Recruitment and People Management in 2026

AI in HR and hiring — how AI tools screen CVs, predict attrition, personalise onboarding, and the bias risks that come with algorithmic recruitment.

Industry / HR

AI in HR and Hiring

How AI is screening 500 CVs in minutes, predicting which employees will leave before they resign, and personalising onboarding — and the bias problems that follow.

67%
Of HR professionals now use AI tools for some hiring tasks — up from 28% in 2022 [SHRM]
50%
Reduction in time-to-hire achieved by companies using AI screening and scheduling tools [LinkedIn Talent]
$215K
Average cost of replacing a senior employee — making retention prediction AI a compelling investment even at enterprise software prices [SHRM]
CV screening and ranking
AI screens CVs against job requirements and ranks candidates by match, processing 500 CVs in the time a human takes to screen 20. HireVue and Workday AI are market leaders.
HireVue, Workday AI, Lever, Greenhouse AI
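At its simplest, this kind of ranking is keyword overlap between the CV and the job description. The sketch below is a deliberately naive illustration (the function name, tokeniser, and scoring are invented for this example; commercial screeners use far richer NLP, synonym matching, and trained models):

```python
import re
from collections import Counter

def keyword_score(cv_text: str, job_description: str) -> float:
    """Toy CV score: fraction of job-description terms found in the CV.
    Illustrative only -- real screeners are far more sophisticated."""
    tokenise = lambda text: re.findall(r"[a-z]+", text.lower())
    jd_terms = set(tokenise(job_description))
    if not jd_terms:
        return 0.0
    cv_terms = Counter(tokenise(cv_text))
    hits = sum(1 for term in jd_terms if cv_terms[term] > 0)
    return hits / len(jd_terms)

# Rank a small pool of (hypothetical) CVs against a job description
cvs = {
    "A": "Python developer with SQL and cloud experience",
    "B": "Retail manager, customer service, rostering",
}
jd = "Python SQL cloud"
ranked = sorted(cvs, key=lambda k: keyword_score(cvs[k], jd), reverse=True)
# ranked[0] is "A": it matches all three job-description terms
```

Even this toy version shows why formatting matters for candidates: a term the parser cannot extract is a term that never scores.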
Interview scheduling
AI assistants (Calendly AI, Clara) handle scheduling back-and-forth, integrating with interviewer calendars and sending confirmations automatically. Saves 30-60 minutes per hire.
Calendly AI, Clara, ModernHire, HireEZ
AI video interviews
Async video interview platforms where candidates record answers to set questions. AI analyses responses for relevant keywords, communication clarity, and structured completeness, rather than facial expressions, a practice that has been widely criticised.
HireVue, Spark Hire, VidCruiter
Attrition prediction
AI analyses engagement survey data, manager feedback, performance patterns, and behavioural signals to predict which employees are at flight risk 6-12 months before they leave. IBM's attrition model is reportedly 95% accurate.
IBM Watson Talent, Visier, Lattice, Workday
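Under the hood, attrition models combine weighted signals into a risk score. The sketch below is a hand-weighted stand-in (the signal names and weights are invented for illustration; production systems like IBM's train weights from historical data rather than setting them by hand):

```python
def flight_risk_score(signals: dict) -> float:
    """Toy attrition score: weighted sum of boolean risk signals.
    Weights are invented for illustration, not taken from any vendor."""
    weights = {
        "engagement_drop": 0.35,   # falling survey scores
        "pay_below_market": 0.25,  # compensation benchmark gap
        "no_promotion_2yrs": 0.20, # stalled career progression
        "manager_change": 0.10,    # recent reporting-line change
        "commute_increase": 0.10,  # e.g. after an office move
    }
    return sum(w for key, w in weights.items() if signals.get(key))

# An employee with falling engagement and below-market pay
score = flight_risk_score({"engagement_drop": True, "pay_below_market": True})
```

A real model would also be validated against employees who actually left, which is exactly where historical bias can creep back in.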
Personalised onboarding
AI onboarding platforms adapt content to the new hire's role, background, and learning style, and answer common questions 24/7 via chatbot. Microsoft's onboarding AI reduced time-to-productivity by 23% for new managers.
Enboarder, BambooHR AI, Sapling, Rippling
Performance management
AI helps managers write better performance reviews (flagging vague language, potential bias in word choice), suggests development goals based on career paths, and identifies skills gaps across teams.
Lattice AI, Culture Amp, 15Five, Betterworks
Amazon's failed hiring AI (2018)
Amazon scrapped an AI hiring tool after discovering it systematically downrated CVs from women. The model had learned from 10 years of hiring data — in which successful hires were predominantly male. It penalised CVs that mentioned "women's" (e.g. "women's chess club") and downrated graduates of all-women's colleges.
Proxy discrimination
AI hiring tools that are explicitly prohibited from using protected characteristics (race, gender, age) can still discriminate indirectly through proxy variables — ZIP codes that correlate with race, graduation years that reveal age, or activity patterns that correlate with gender.
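One crude way auditors probe for proxies is to check how well a "neutral" feature predicts the protected attribute. The sketch below measures this as per-value majority-class accuracy (function name and data are hypothetical; real audits use formal fairness metrics and statistical tests, not this shortcut):

```python
from collections import defaultdict

def proxy_strength(records: list, feature: str, protected: str) -> float:
    """Toy proxy check: accuracy of guessing the protected attribute
    from `feature` alone (majority group per feature value).
    Scores far above the overall base rate suggest a proxy variable.
    Illustrative only -- real bias audits use rigorous methods."""
    by_value = defaultdict(list)
    for record in records:
        by_value[record[feature]].append(record[protected])
    correct = sum(
        max(groups.count(g) for g in set(groups))
        for groups in by_value.values()
    )
    return correct / len(records)

# Hypothetical applicant pool where postcode splits cleanly by group
records = [
    {"postcode": "2170", "group": "a"},
    {"postcode": "2170", "group": "a"},
    {"postcode": "2026", "group": "b"},
    {"postcode": "2026", "group": "b"},
]
# Base rate is 0.5; postcode predicts group perfectly, so it is a
# near-perfect proxy here and should be flagged for review.
strength = proxy_strength(records, "postcode", "group")
```

The point is not the metric itself but the habit: any feature that reconstructs a protected attribute can smuggle discrimination back into a "blind" model.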
Facial analysis in interviews
AI tools that analyse facial expressions, vocal tone, or non-verbal behaviour in video interviews have been criticised by researchers as pseudoscientific — no evidence they predict job performance, and high risk of disability discrimination (e.g. candidates with autism or Parkinson's).
EU AI Act requirements
AI systems used in hiring are classified as "high-risk" under the EU AI Act, requiring human oversight, accuracy monitoring, bias testing, and transparency to applicants about AI use. Enforcement begins August 2026.
The practical guidance
Use AI for administrative efficiency (scheduling, initial screening, document processing) where the time savings are clear and bias risk is manageable with oversight. Be very cautious about AI decision-making tools that score candidate "suitability" or analyse personality from video — the evidence for validity is weak and the discrimination liability is real. Always maintain human review for hiring decisions, especially for the shortlist and selection stages.
Is AI hiring screening legal?
In most jurisdictions: yes, with caveats. EU AI Act classifies AI recruitment tools as high-risk from August 2026 — requiring transparency, bias auditing, and human oversight. New York City Local Law 144 requires bias audits and candidate notification for any AI hiring tool. Australia, UK, and US EEOC guidance all require that AI screening tools don't produce disparate impact against protected groups. Employers are responsible for the outcomes of AI tools they deploy — "the AI decided" is not a defence.
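Disparate impact is usually assessed by comparing selection rates between groups. A common rule of thumb is the US EEOC "four-fifths" rule: a ratio below 0.8 flags possible adverse impact, and NYC Local Law 144 audits report similar impact ratios. A minimal sketch of the calculation (the function name and example numbers are invented):

```python
def impact_ratio(selected_a: int, total_a: int,
                 selected_b: int, total_b: int) -> float:
    """Selection-rate ratio between two groups (lower rate / higher rate).
    Under the EEOC four-fifths rule of thumb, a result below 0.8
    flags possible disparate impact. Sketch only, not legal advice."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher

# Hypothetical: an AI tool shortlists 30 of 100 men but 18 of 100 women
ratio = impact_ratio(18, 100, 30, 100)  # 0.18 / 0.30 = 0.6, below 0.8
```

A ratio like 0.6 would not automatically make the tool illegal, but it is exactly the kind of result an employer is expected to investigate and justify.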
How do I know if a company uses AI to screen my CV?
You often won't — and many companies don't disclose it. Best practices for AI-screened CVs: use standard section headings (Work Experience, Education), include keywords directly from the job description, avoid tables and graphics (AI parsers misread them), and submit as a clean PDF or Word doc. ATS (Applicant Tracking System) optimisation and AI optimisation have become the same discipline.
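The keyword advice above can be turned into a quick self-check before submitting: list the job-description terms your CV never mentions. A toy version (function name and tokeniser are invented; real ATS parsers also handle synonyms and multi-word phrases):

```python
import re

def missing_keywords(cv_text: str, job_description: str) -> list:
    """Job-description terms absent from a CV -- a rough pre-submission
    check. Naive whole-word tokeniser; illustrative only."""
    tokenise = lambda text: set(re.findall(r"[a-z][a-z+#.]*", text.lower()))
    return sorted(tokenise(job_description) - tokenise(cv_text))

# Hypothetical CV vs job description
gaps = missing_keywords(
    "Led Python projects at scale",
    "Python SQL stakeholder management",
)
# gaps -> ["management", "sql", "stakeholder"]
```

If a term in the gap list genuinely describes your experience, add it verbatim; screeners rarely credit a synonym they were not trained to recognise.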

Written by Luke Madden, founder of Veltrix Collective. Data synthesis and analysis by Vel.