AI AND INEQUALITY AT WORK

TECHNOLOGY IS CHANGING HOW WE WORK, BUT NOT ALWAYS FOR THE BETTER. IT CAN TREAT SOME PEOPLE UNFAIRLY BECAUSE OF WHO THEY ARE.

AI SYSTEMS OFTEN REPRODUCE EXISTING BIASES DUE TO THE DATA THEY ARE TRAINED ON AND THE PEOPLE WHO DESIGN THEM.

WHO’S AT RISK?

BLACK, ASIAN AND MINORITY ETHNIC WORKERS

Biased training data can result in hiring tools that unfairly filter out candidates from these communities.

DISABLED WORKERS

Automated assessments may overlook non-visible disabilities or apply unfair assumptions about what people can or can’t do.

OLDER WORKERS

May be unfairly flagged as ‘slow’ by AI screening systems.

YOUNGER WORKERS

May be filtered out by narrow definitions of “experience.”

WOMEN

Algorithms often reward traits coded as male, so women may be penalised for their tone or communication style where a male colleague would not be.

LGBTQ+

Predictive behaviour tools might misinterpret expressions of LGBTQ+ identity as non-standard or risky.

RELIGION OR BELIEF

Workplace monitoring tools can ignore religious needs, such as prayer breaks.

NEURODIVERGENT WORKERS

Systems that expect fast, uniform responses might flag someone unfairly for pausing too long on a call.

WHAT CAN WE DO?
