Locked out by the algorithm: Why AI recruitment may be failing aged care

Last updated on 9 February 2026

Originally published in Hello Leaders print; read the full sixth print edition here.

Artificial intelligence has swept into recruitment, promising the golden trifecta: faster hires, lower costs, and better candidates. For aged care, a sector desperate to fill rosters, the appeal is obvious. But beneath the gloss, emerging evidence suggests these systems could entrench bias, shut out exactly the people the sector needs most, and leave behind a trail of invisible exclusion.

The shiny tool with a shadow

Across Australia and the world, AI hiring systems (AHSs) are rapidly becoming the norm. Chatbots now conduct first interviews, algorithms rank candidates, and video platforms claim to predict job fit. L’Oréal, Vodafone, Anglicare Sydney and countless others are experimenting. Hospitals overseas are relying on AI to sift through thousands of applications.

But as Dr Natalie Sheard, lawyer and McKenzie Postdoctoral Fellow at the University of Melbourne, warns, the very systems built to streamline recruitment risk “solidifying traditional forms of discrimination, playing an active role in creating new ones, and paving the way for intentional discrimination.” Her reminder is blunt: data is not neutral.

Maria’s fight

For Maria, 62, the cost is personal. After two decades in pharmaceutical marketing, she lost her job in a restructure. She retrained, learnt new digital skills, and meticulously tracked her applications in a spreadsheet her son set up. Out of 127 applications, 89 were rejected within hours – far too fast for a human to have read them.

“I fixed up my resume, updated LinkedIn, even learnt social media,” she recalls.

“But looking good doesn’t get you past programs that weed out based on graduation dates and career gaps.”

Her story is not an outlier. Globally, AI hiring systems have been caught downgrading CVs mentioning “women’s,” rejecting candidates who wear headscarves, and excluding people with disabilities because adjustments were deemed “too much trouble.”
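To make the mechanism concrete, consider a minimal sketch of the kind of rule-based filter Maria describes. This is entirely hypothetical, written in Python for illustration: the field names, thresholds, and screening rules are invented, not drawn from any real hiring product. The point is that rules which never mention age can still screen for it.

```python
# Hypothetical sketch: how a "neutral" rule-based screener can encode age bias.
# All field names, thresholds, and rules here are invented for illustration;
# they are not taken from any real AI hiring system.
from dataclasses import dataclass
from datetime import date

@dataclass
class Candidate:
    name: str
    graduation_year: int      # acts as a proxy for age
    longest_gap_months: int   # career breaks, caring duties, retraining

def auto_screen(c: Candidate, today: date = date(2026, 2, 9)) -> bool:
    """Return True if the candidate passes the automated filter."""
    # Rule 1: graduated within the last 15 years. Age is never mentioned,
    # yet this silently excludes most applicants over about 40.
    if today.year - c.graduation_year > 15:
        return False
    # Rule 2: no employment gap longer than six months. This penalises
    # carers, parents, and anyone who retrained mid-career.
    if c.longest_gap_months > 6:
        return False
    return True

# A retrained 62-year-old applicant like Maria fails both rules instantly,
# which is why rejections can arrive within hours of applying.
maria = Candidate("Maria", graduation_year=1986, longest_gap_months=9)
print(auto_screen(maria))  # False
```

Neither rule references age, gender, or disability, yet together they reproduce exactly the exclusion Maria encountered, at machine speed and with no human ever reading the CV.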

Aged care can’t afford blind spots

For aged care, where older workers bring empathy, resilience and maturity, algorithmic bias is more than a compliance risk – it is a workforce threat. The industry relies on people with non-linear careers, diverse cultural backgrounds, and lived experience. If AI screens them out, the sector risks homogeneity at a time when diversity is its strength.

Sheard’s research highlights the scale of concern: an estimated 30% of Australian organisations and 42% of global companies already use predictive AI in hiring. Yet evidence of bias is mounting faster than safeguards.

“A discriminatory AHS,” she cautions, “can cause harm at unprecedented speed and scale, and has the capacity to systematically lock disadvantaged groups out of the workforce.”

A leadership test

Hello Leaders readers know that recruitment in aged care is not just about filling vacancies. It is about building workforces of compassion and competence for some of society’s most vulnerable people. AI may help shortlist and speed things up – but it cannot be allowed to replace the human judgment required to see value in non-traditional candidates.

Maria puts it best: “At 62, I’m not outdated. I’m experienced. I’m not set in my ways. I’m seasoned. I’m not too expensive. I’m valuable. And I’m still fighting.”

For aged care leaders, the question is simple: will you let algorithms decide who gets through the door, or will you insist that people – in all their diversity – remain at the heart of recruitment?

Tags: aged care, workforce, AI