News
When AI Gets It Wrong 🤖👀
📉 The Illusion of Technological Neutrality
🔍 A Disturbing Case in Detroit
In the United States, Detroit police have wrongfully arrested several people on the basis of automated facial recognition matches. Among them: Porcha Woodruff, a Black woman who was eight months pregnant when she was falsely accused of robbery and carjacking in February 2023.
⚠️ Unjust Arrest and Prolonged Detention
📌 Flagged by a facial recognition match and then identified by a witness, she was arrested at her home in front of her children.
📌 Interrogated and detained for 11 hours.
📌 Released only on a $100,000 personal bond.
📌 A month later, all charges were dropped.
⚖️ A Fight for Justice
💥 Porcha Woodruff sued the city of Detroit and a police officer for:
✅ Unlawful imprisonment 🚔
✅ Violation of her fundamental rights ⚖️
✅ Technological discrimination 📊
🔎 Algorithmic Bias Under Scrutiny
📢 Numerous studies, including a 2019 evaluation by the U.S. National Institute of Standards and Technology (NIST), have shown that facial recognition systems misidentify people of color at significantly higher rates. These errors lead to wrongful arrests and raise major concerns about the ethics and reliability of AI in public safety.
📢 Conclusion
👉 This case highlights the dangers of unchecked AI in legal procedures.
👉 AI is not neutral — it reflects the biases in the data it is trained on.
👉 Strict regulation and oversight mechanisms are essential to prevent such abuses.