Gender bias in AI: Why the industry must break the cycle

March 5, 2026

At the 2026 Tech Show London expo, Propeller’s Associate Director, Marketing and Advertising Technology, Alex Humphries-French, moderated a panel exploring one of the most pressing questions in technology today: how gender bias is shaping artificial intelligence, and what the industry must do to address it. He was joined by:

  • Niki Chana, Founding Global Board Member of The Women in Programmatic Network (TWIPN) and Programmatic Strategy Director at SBS
  • Tsitsi Matekaire, Director, Regions at Equality Now
  • Priti Mhatre, Chief Product and AI Officer at WPP Open
  • Freddie Turner, WACL Exco Member and Managing Director, EMEA at Chalice AI

AI is rapidly becoming part of the infrastructure that influences hiring decisions, healthcare systems, media outputs and even political narratives. As adoption grows, so does the importance of ensuring these systems reflect the societies they serve.

Opening the session, Alex set out the scale of the challenge facing the industry.

Women currently represent roughly 12% of AI researchers globally and hold just one in five technical roles at major AI companies, while women-founded companies receive only 2% of venture capital funding. These imbalances shape the datasets and decision-making processes that underpin modern AI systems.

“Bias is not new, but AI scale is. It automates it and embeds it.”

  • Alex Humphries-French, Propeller Group

When bias becomes infrastructure

One of the core themes of the discussion was how AI can amplify historical inequalities if organisations are not deliberate about how systems are built and deployed.

Because AI models are trained on historical data, they inevitably reflect the patterns contained within that data. If past hiring practices or leadership structures were unequal, those trends risk being replicated by automated systems.

Speakers noted that this dynamic can quietly influence opportunities in areas such as recruitment, salary recommendations and career development. When left unchecked, bias shifts from individual decisions to system-level outcomes.

“If we put in data that has existed before, then essentially we’re industrialising the problem rather than providing solutions.” 

  • Freddie Turner, Chalice AI

The social impact of biased systems

The conversation also highlighted the wider social consequences of biased technologies.

Gender bias in AI affects more than employment opportunities. It can also contribute to the growth of technology-enabled gender-based violence, including deepfakes and non-consensual imagery.

Advocacy organisations are already seeing the effects of these technologies across different sectors, particularly affecting women in public-facing roles such as journalism, politics and activism.

“Gender bias is really real and brings a lot of harm to women and girls across the world.” 

  • Tsitsi Matekaire, Equality Now

Why representation matters in AI development

Another key takeaway from the panel was the importance of representation within the teams building AI systems.

Without diverse perspectives involved in product development, subtle biases may go unnoticed. Even systems that pass technical testing can produce outcomes that reinforce harmful stereotypes or unrealistic standards. 

The panel shared examples where AI-generated outputs unintentionally created narrow representations of people, highlighting the importance of human oversight during development.

“It’s important that women are represented at every level, because it’s not one fix and it’s not an easy fix. It’s a system fix.” 

  • Priti Mhatre, WPP Open

Turning AI into part of the solution

Despite the risks, the panel also emphasised that AI has the potential to help organisations identify and correct bias if used responsibly. 

AI systems can analyse large datasets to reveal patterns that may otherwise remain hidden. For example, they can highlight disparities in hiring decisions, promotion timelines or pay structures across organisations.

The challenge is ensuring that leaders are willing to act on the insights these systems provide.

“AI can spot patterns at a scale far faster and more accurately than any human can. It’s then up to leadership to acknowledge what the data is showing.”

  • Freddie Turner, Chalice AI

Looking ahead

As AI continues to evolve, the panel made clear that the conversation around bias must remain a priority for organisations adopting these technologies.

Building fairer systems will require more diverse teams, stronger governance frameworks and greater accountability across the technology ecosystem. From the data used to train models to the people involved in designing and deploying them, every stage of the process plays a role in shaping outcomes.

The discussion also highlighted that this is not a challenge that can be solved by the technology sector alone. Governments, businesses, advocacy groups and individuals all have a role to play in ensuring that AI systems reflect the societies they serve.

And while the panel took place just ahead of International Women’s Day, Alex closed the session by emphasising that moments like this should spark ongoing action rather than a one-day conversation.

“I do want to highlight that it’s International Women’s Day on Sunday, and days like this are important because they focus attention on issues that matter. But like Valentine’s Day, we shouldn’t need a single date in the calendar to remind us to treat people well or recognise inequality. These are conversations and actions that should continue throughout the year across our industries and technologies.”

He added that while the discussion only scratched the surface of a much larger issue, it is one the industry must continue to address.

“I think we’re just scratching the surface of a much bigger topic here. But there are solutions, and it’s important that this conversation continues beyond this stage.”

