
Module 9: The coded gaze

Hi! Hello!

It’s so funny: I talk so much about access and being kind to yourself, and yet I feel this innate urge to apologize for being unavailable last week. So I guess I will say I’m sorry, because I do really care about your learning space and I hope that comes through. I think that it does. But this semester, like most or all semesters I would argue, is untenable: the conditions we’re expected to perform under are unrealistic for any bodymind (bodymind is a Disability Justice term), at least any bodymind hoping to have some amount of joy for the time we’re here on this planet earth. Yeah, it’s rough out here, people. I apologize for the rant; I can’t help myself. I also don’t understand why we’re expected to be productive constantly, but that is another byproduct of white supremacy culture. So. All the more reason to flip the script.

Please DM me if you need any support or feel lost or just want to say hello. I’m here for you and hope this asynchronous space can still feel human.

Let’s jump back into thinking critically about the fields within engineering and the sciences. This goes for everyone, but especially for Computer Science majors: have you considered the ways in which your field carries bias? The ways your field has a profound impact on how society is shaped?

I’m not sure whether these questions are being raised in your other courses (I hope they are! Tell me if they are!), but since we’re considering both rhetoric and composition, they must be taken into account here.

For this week, I would like you to watch this 13-minute talk by Dr. Joy Buolamwini about facial recognition and what happens when the sample set skews white and male.

For the module comment, I would like you to consider the following:

Take note of 2-3 rhetorical issues Dr. Buolamwini raises that speak to you. For me, it was her reframing of the “under-sampled majority” as a way to think about who is represented in most technological spaces and who is erased. So often we say “minority” when speaking about the people of the global majority who are not white, and that default creates an intentional bias with real implications (think policing, think community funding, think incarceration rates).

Have you ever considered algorithmic bias when using your devices?

What are some ways we can shift the dominant data set?

If you have an experience of algorithmic bias that you want to share, I welcome it in this space but it is not required.
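About that question of shifting the dominant data set: if you want a concrete (and totally optional) picture of how a skewed sample set plays out, here is a tiny toy sketch in Python. Everything in it is invented for illustration (the “faces” are just numbers and the “recognizer” is as crude as it gets), but the pattern it produces echoes what Dr. Buolamwini describes: the system works well for the group that dominates the training data and much worse for the group that is under-sampled.

    import random

    random.seed(0)

    # Toy data: each "face" is just a number. Group A faces cluster near 0,
    # group B faces cluster near 3. (All numbers are invented for illustration.)
    def faces(group, n):
        center = 0.0 if group == "A" else 3.0
        return [random.gauss(center, 1.0) for _ in range(n)]

    # A skewed training set: 900 faces from group A, only 100 from group B.
    train = faces("A", 900) + faces("B", 100)

    # A crude "recognizer": its template is the average training face,
    # and it only detects faces that land within 2 units of that template.
    template = sum(train) / len(train)

    def detects(face):
        return abs(face - template) <= 2.0

    def detection_rate(group, n=1000):
        return sum(detects(f) for f in faces(group, n)) / n

    # The under-sampled group gets detected far less often.
    print(f"detection rate for group A: {detection_rate('A'):.0%}")
    print(f"detection rate for group B: {detection_rate('B'):.0%}")

If you rebalance the training set (say, 500 faces from each group), the template shifts toward the middle and the two detection rates land much closer together. That, in miniature, is one way to think about “shifting the dominant data set.”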

Thanks everyone for staying engaged and enjoy the rest of your week!


2 Comments

  1. I noticed she used the phrase “predictive policing.” I have never heard that phrase before, and based on my understanding, it means preemptively enforcing the law based on assumptions. This phrase helped me realize (I always knew this, just never thought about it) that preemptive policing is inherently biased. The stereotype of the angry black woman or the gangster black man means countless black people are harassed or incarcerated because of preemptive police judgment.
    I also noticed she used the word pale instead of whiter/lighter skin. I honestly do not know why she used this word instead, but I imagine it had a specific meaning for her.
    I have considered that algorithmic bias is rife in the devices I use. I have never treated any device as all-knowing or impartial, like a robot, because humans are the ones that created it. Naturally, A.I. makes specific choices based on data that humans encode. It is not making a natural decision but a forced one. The only way I can think of shifting the dominant data set is to get more people from different backgrounds and experiences to work on the data. That way, we have different inputs.

  2. I never really considered algorithmic bias when using technology devices; however, after watching this video I realize it is a pretty important topic, especially when she brought up the fact that some people get mistaken for criminals because of poor facial recognition systems. One way we can shift the dominant data set would be to make sure large tech companies are making this a priority.


Course Info

Professor: Andréa Stella (she/her/hers)

Email: astella@ccny.cuny.edu

Zoom: 4208050203

Slack: engl21007fall22.slack.com/