
Identifying an injustice in Apple's AI algorithm

I have written about bias in AI before. Well, now I have seen it first-hand using my iPhone.


Here’s the backstory:


It was Halloween in Chicago, and it snowed pretty heavily all day. Overnight, freezing temperatures left black ice on the sidewalks and streets. (For those of you who live in warm climates, black ice is a thin coating of ice that you can't see.) Early in the morning, after I nearly slipped on the street, I wrote to my family to warn them about the BLACK ICE.



You can see the injustice in the screenshot: when I typed, "Hey fam, be careful black…", the AI-generated suggested words were not what I expected — ICE.


The Algorithmic Justice League is doing incredible work to bring these issues to the fore and drive positive change. But BIG TECH also needs to take responsibility for its role in perpetuating bias through its AI algorithms, and take action to remove the biases in its training data.


If we are going to harness the potential of AI for positive social impact, we need collective action to ensure AI works for everyone.


 
 
 
