Bias in, Bias Out: What AI Gets Wrong with Women with Siri Swahari (CGI)

About this title

When a group of powerhouse women at the MIT AI-Powered Women Conference asked ChatGPT one simple question, “What do you think I look like?”, the results were… revealing. And not in a good way.

In this live episode from Boston, we sit down with Siri Swahari, Vice President & Partner at CGI, to unpack what happened that night, why AI keeps defaulting to the same stale stereotypes, and how leaders can push for systems that are actually fair, representative, and useful.

Siri breaks down:

The viral moment when AI generated bald white dudes for women of color

Why the problem isn’t “men vs. women”… it’s the data

What organizations must fix before unleashing AI tools on their teams

How bias sneaks into models — even when intentions are good

Why representation, guardrails, and real stress-testing matter

What leaders should ask before adopting any AI solution

Why she’s cautious about AI’s pace — but still optimistic

And her work building The Women Executive Circle, a growing cross-state community supporting senior women leaders

If you work in HR, ops, tech, policy, or anywhere AI touches people, this episode will hit home.

👉 Watch the full episode on YouTube: https://youtu.be/4D78uhi-rw0
