Week 7: The smart city is watching you, or These walls have ears
ANTH 3608: Becoming cyborgs—Technology and society (Semester 2, 2025)
September 19, 2025
Main reading: Kang and Hudson (2024); Stucky (2025)
Other reading: Brayne (2021)
Notes
Automation and algorithmic decision-making are watchwords of late, and I don’t think it will be surprising to learn that they can be biased. Garbage in, garbage out: an algorithm is only as good as the prior, preconceived ideas embedded in its data. Of course technology in law enforcement results in racial profiling! This week’s readings allow us to extend and complicate our understanding of the socially embedded nature of technical infrastructures. Yes, these systems reflect their origins in a particular place and time and, more pertinently, the biases of the dominant group within the society that uses them.
Well, if it’s so obvious that these systems are flawed, why do people keep using them? (Seriously, why spend millions of dollars on gunshot microphones? You could just flip a coin to decide whether to dispatch a car, since that would be more likely to stop crimes.)
From a network perspective, though, we shouldn’t be so hasty as to reify the social as the source of systemic inequality. As we have come to learn, there is no big, abstract collective consciousness; there are just recursive loops across heterogeneous nodes in a network. Policing technologies diffuse responsibility for the biases already present in policing.
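To make that recursive loop concrete, here is a minimal sketch in Python (a toy model of my own, not anything drawn from Brayne or the other readings): two precincts with identical underlying incident rates, patrol time dispatched in proportion to past records, and incidents recorded only where patrols are actually sent.

```python
# A toy "recorded crime" feedback loop between two precincts, A and B.
# The true weekly incident rate is identical in both; only the historical
# records differ. Patrol time is allocated in proportion to past records,
# and new incidents are recorded only where patrols are present.
true_rate = 10                    # actual incidents per week, same in A and B
recorded = {"A": 6.0, "B": 4.0}   # starting records: A slightly over-represented

for week in range(20):
    total = sum(recorded.values())
    # dispatch follows the records, and the records follow the dispatch
    new = {p: true_rate * (recorded[p] / total) for p in recorded}
    for p in recorded:
        recorded[p] += new[p]

share_a = recorded["A"] / sum(recorded.values())
print(f"A's share of all recorded incidents after 20 weeks: {share_a:.2f}")
# Prints 0.60: the initial skew is reproduced week after week, even though
# nothing about the underlying incidents differs between the precincts.
```

The numbers are invented; the point is the shape of the loop. The network’s output becomes its own input, so the initial skew in the records never washes out, no matter how long the system runs.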
Keywords
listening, surveillance, ground truth
Learning outcomes
- Be able to explain the technical concept of a “ground truth” in terms of sociomaterial networks (see the sketch below)
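To give the “ground truth” outcome a concrete hook, here is a hypothetical sketch (again my own, with made-up numbers): the measured accuracy of an acoustic gunshot detector depends entirely on which labels we agree to treat as ground truth, and those labels are themselves produced somewhere in the sociomaterial network (dispatch logs, officer reports, an independent audit).

```python
# One set of detector outputs, scored against two different "ground truths":
# labels derived from dispatch records, and labels from an independent audit.
detections      = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # what the microphones flagged
dispatch_labels = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # "truth" = a car was dispatched
audit_labels    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]  # "truth" = independent review

def accuracy(preds, truth):
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

print(accuracy(detections, dispatch_labels))  # 1.0 -- looks flawless
print(accuracy(detections, audit_labels))     # 0.7 -- looks much shakier
# Nothing about the detector changed; only the choice of ground truth did.
```

Same detector, same recordings; the only thing that changed is which node in the network got to define what “really” happened.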