AI, Ethics, and Geoethics (CS 5970)
Module 8: The Power Chapter
- (60 min) Read the chapter
- (30 min) Discussion and case study on Slack
- (5 min) Grading declaration
Matrix of domination (image from “Chapter 1: The Power Chapter”)
Discussion and Assignment
This discussion will happen in the #data-feminism channel. Remember to use threads so that we can keep track of the conversation more easily.
We are going to do two things for this chapter. The first is our more traditional discussion based on the chapter and quotes from it. The second is a short assignment based on the who questions in the book.
As with previous chapters, I have a few quotes to think about and discuss. As always, feel free to find and discuss your own quotes as well.
- “Why did it take the near-death of an international sports superstar for the media to begin paying attention?” (page 4, Chapter 1, Data Feminism)
- “When data teams are primarily composed of people from dominant groups, these perspectives come to exert outsized influence on the decisions being made – to the exclusion of other identities and perspectives. This is not usually intentional; it comes from the ignorance of being on top. We describe this deficiency as a privilege hazard.” (page 9, Chapter 1, Data Feminism)
- “… the biggest threat from artificial intelligence systems is not that they will become smarter than humans, but rather that they will hard-code sexism, racism, and other forms of discrimination into the digital infrastructure of our societies.” (page 9, Chapter 1, Data Feminism)
- “The phenomenon of missing data is a regular and expected outcome in all societies characterized by unequal power relations, in which a gendered, racialized order is maintained through willful disregard, deferral of responsibility, and organized neglect for data and statistics about those minoritized bodies who do not hold power.” (page 18, Chapter 1, Data Feminism)
- “Far too often, the problem is not that data about minoritized groups are missing but the reverse: the databases and data systems of powerful institutions are built on the excessive surveillance of minoritized groups.” (page 18, Chapter 1, Data Feminism)
- “This model, like many, was designed under two flawed assumptions: (1) that more data is more better and (2) that the data are a neutral input….data are never neutral; they are always the biased output of unequal social, historical, and economic conditions” (page 19, Chapter 1, Data Feminism)
- “How did we get to the point where data science is used almost exclusively in the service of profit (for a few), surveillance (of the minoritized), and efficiency (amidst scarcity)? … If science, surveillance, and selling are the main goals that data are serving, because that’s who has the money, then what other goals and purposes are going underserved?” (page 21, Chapter 1, Data Feminism)
Short Research Assignment/Case study
Your research assignment for this chapter is based on the following quote:
who is doing the work of data science (and who is not)? Whose goals are prioritized in data science (and whose are not)? And who benefits from data science (and who is either overlooked or actively harmed)? … Asking these who questions allows us, as data scientists ourselves, to start to see how privilege is baked into our data practices and our data products. (page 6-7, Chapter 1, Data Feminism)
Think about your favorite interesting hard problem (this could be your project, your research, or something you care deeply about!). Go to the #case-studies channel, give us a description of your problem, and then answer the who questions about the data that is (or isn’t) being collected to address it. You don’t need to go into the depth that the authors do in the book (they take a whole chapter!), but make your answer a paragraph or two.
OU students: After you have done your reading and engaged actively in discussion, complete the grading declaration titled “Module 8: Data Feminism and Power”