Let's Talk About Coded Bias

What is Coded Bias?
Coded Bias is a Netflix documentary that explores the downsides of the artificial intelligence world, specifically facial recognition.

What is the problem?
The problem comes from the "bias" part. There are many different forms of bias, whether racial, gender-based, or rooted in socioeconomic status, and anyone can be biased in those areas. What the film captures so well is how all three are being confronted in today's AI landscape.

Socioeconomic, Racial, and Gender | Can AI Be Biased in These Areas?
Well, can AI be biased? If you take the time to watch the film, you will see the answer is "yes." As one of the researchers states in the film, "Everyone has unconscious bias and embedded that bias in technologies they create." I would like to add to that notion: the bias in a system comes from the human who creates it, and the most important aspect of this relationship is how that human identifies. Is the human a white male from a wealthy background, or someone like me, a black woman from a middle-class family? See where the difference lies? I believe "unconscious bias" can come from where we grew up. We want to identify with what we are familiar with. A white male from a wealthy background is likely used to white people and hence used to creating products for that demographic. As for me, my products could be directed toward black people.

The problem with this kind of thinking is creating a product that is meant for everyone but only works for certain types of people, for example, products that rely on facial recognition. See where the issue lies here? Neither of those approaches constitutes a "we." When we allow ourselves to stay in our own circles, thinking "one system = one demographic," we will only end up with biased systems instead of unbiased ones.



What issues has Biased AI caused?

Quite a few! While the documentary covers several issues that are all important to this conversation, I want to highlight the one that stood out to me the most. Let's start with the Atlantic Plaza Towers in Brooklyn, New York.
Facial Recognition at Atlantic Plaza Towers

Nelson Management Group planned to install a biometric system in the Atlantic Plaza Towers: a facial recognition system chosen by the landlord alone, entirely excluding the tenants from the decision. Understandably, the tenants were agitated by this news and wanted to put a stop to it. As a start, they asked New York State Homes and Community Renewal for more time to file an opposition to the system.
The Problem?
This type of system pushes a negative relationship between the tenants and the landlord, one of "you cannot see me, but I watch you." One tenant received camera snapshots of herself with markings showing where "suspicion" supposedly lay in her everyday life in the complex. Implementing this system would send the message that, unless proven otherwise, you are not free to have your own life outside the landlord's view. This message is the start of a claimed "right to invade others' privacy," and it will lead to mistrust and disloyalty from the tenants toward the landlord. Lastly, as one of the tenants states:

"He owns 12 developments. Why did he target a development that is predominantly Afro-Americans in East New York to test — and predominantly women that live in the development — to test,"

---Icemae Downes

Implementing this system was going to push the narrative of the "right to invade others' privacy," especially if you're a person of color, let alone black. Why implement a system that is not for the people when instead it harms their everyday living? This is one of the reasons the tenants were angry with this plan (source: amNY).
Solution?

One name: Joy Buolamwini! A Ghanaian-American computer scientist, born in Canada, who founded the Algorithmic Justice League. The film showcases her communication with the tenants and her endless support behind them. I genuinely think that her taking the chance to testify before the House Committee on Oversight and Reform, about how facial recognition has negatively impacted the civil rights and liberties of people of color, gave the tenants a huge support system. This was part of the reason I found this situation so important to highlight. (source: House Hearing on Facial Recognition)
This situation hit close to home for me, not only because I identify with this community but also because I know the hardships that people of color already face in that part of New York. I think this situation is one of many that support the argument that there is bias in the tech industry. Was it really a coincidence that this system was slated for a predominantly black neighborhood? I do not think so. This situation helped me see how easy it is for racially biased people to push for systems that support their bias when there are no laws or regulations to hinder that behavior. That is why I am very thankful for advocates like Joy who push for the laws and regulations that protect more people like Icemae and me.
My Thoughts
