Identifying an injustice in Apple's AI algorithm

I have written about bias in AI before. Now I have seen it first-hand on my iPhone.

Here’s the backstory:

It was Halloween in Chicago, and it snowed heavily all day. Overnight, freezing temperatures left black ice on the sidewalks and streets. (For those of you who live in warm climates, black ice is a thin, nearly invisible coating of ice.) Early in the morning, after I almost slipped on the street, I wrote to my family to warn them about the BLACK ICE.

You can see the injustice in the screenshot: when I typed, “Hey fam, be careful black…”, the AI-generated word suggestions did not include ICE.

The Algorithmic Justice League is doing incredible work to bring these issues to the fore and drive positive change. But BIG TECH also needs to take responsibility for its role in perpetuating bias through its AI algorithms and take action to remove the biases in its training data.

If we are going to harness the potential of AI for positive social impact, we must take collective action to ensure AI works for everyone.
