
Ingrid Burrington: Oral Histories of Surveillance

It’s encouraging to see increased attention to and understanding of the extractive dimensions of computation. I think for a lot of software-oriented researchers, talking about mineral supply chains feels like something that’s not their problem, or at least not related to the problems they face. But the ideologies of mineral extraction and data extraction are very similar, right? Treating harm as “externalities,” constantly seeking new sources to extract, using promises of “transparency” to placate publics.
—Ingrid Burrington

Our Data Bodies (ODB) is excited to share a series of oral histories from organizers, writers, and cultural workers whose work has been steeped in resisting discriminatory surveillance technology and racialized surveillance capitalism, and has illuminated strategies for abolition and abolitionist reform. This oral history features Ingrid Burrington, a Brooklyn-based writer, educator, and artist whose work spans art, power, technology, and journalism.

Seeta Peña Gangadharan: When I think of Ingrid Burrington, I think: artist, educator, journalist, researcher, and instigator. Tell us a little bit about yourself and your work.  

Ingrid Burrington: I sometimes joke that my working across art, journalism, and education has to some extent been out of necessity: I finished undergrad in 2009, entering a recession economy with a visual arts degree and a lot of experience in the “etc” jobs section of Craigslist. (I was, however, privileged enough to do this without loan debt thanks to scholarships and a death in the family, and I absolutely have been able to pursue the career I’ve had because I didn’t have student loans hanging over me.)

Finishing college in the wake of the 2008 financial crisis is also probably part of the reason my research interests have revolved around power, complex systems, and geography. Trying to grasp how mortgages in Arizona and Florida contributed to economic collapse in Greece and Iceland (and my own prospects in Baltimore at that time) kind of logically led to dissociation, which led to mapping as both a research tool and coping strategy. “Technology” kind of only became a focal point of my work because it’s a vector of power.  

Situating power in place and history can help with practical, tactical challenges to it (direct actions, lawsuits, etc.) but I also find it existentially helpful because it grounds otherwise-opaque, seemingly non-negotiable distributions of power in material reality and historical contingency–which is to say they could have been and still could be made otherwise. 

Seeta Peña Gangadharan: If I understand correctly, you teach a computer ethics course to programmers. But you’re not a computer science person, you don’t believe in ethics as a framework, and you don’t use the word ethics in relation to your own work and artistic practice. So what’s your approach? What are you teaching students about AI, automated decision systems, surveillance tech? 

Ingrid Burrington: Yeah, I teach a class called “Ethics of Computer Science” that’s primarily geared toward engineering students at a small college in New York City. To be clear, I’m not anti-ethics per se so much as I think it’s one of several fuzzy words that gets used when people want to talk about technology and power but feel weird about the word “power.” (“Privacy” is another word like this.)  

The definition I use for power in these settings is pretty broad and I think I stole it from David Graeber–basically, the ability to limit or expand one’s own options or others’ options (“others” can be people, but it could also be other life forms). Power differentials aren’t inherently bad (you might want a parent to have the power to stop a baby from jumping out a window, or for a labor union to have the power to limit retaliatory actions available to bosses) but they are generally where and how conflicts emerge. How people feel about a particular distribution of power in a given situation is often a pretty good reflection of their own sense of ethics or morals.  

Since technologies and buzzwords change all the time in tech (e.g., a lot of the complex statistical modeling that gets lumped into “AI” today was called “big data” five years ago), my focus with teaching students is more on mapping power dynamics that different technologies exacerbate or enable and understanding where they have–or can collectively build–power in different systems.  

Seeta Peña Gangadharan: What has been the most difficult and the most rewarding? And what do you think computer science/data science students need the most when it comes to catalyzing a more reflexive mindset towards the work, industry, the market? 

Ingrid Burrington: It’s really rewarding to see my students follow through on and apply things we learned about in class to things they’re passionate about. One of my students started an organization after she graduated that basically helps engineers find jobs working on public interest technology (and help organizations doing public interest technology recruit because like, those kinds of organizations don’t necessarily have the kind of recruiting resources of Lockheed Martin or Facebook or Goldman Sachs).  

A difficult part of teaching in general is figuring out how much to push back on students who are clearly operating from a place of bad faith. I only really had one Jordan Peterson stan “devil’s advocate” type kid in my class last year, and there were times it definitely felt like he was itching to get a rise out of me because to him I was some SJW not to be taken seriously. Since a lot of his positions were predicated on fundamental misunderstandings of facts or ahistorical analysis, it was pretty easy to not give him that satisfaction while still proving him wrong (this is also what’s useful about a power-based approach; it’s just a lot more empirically sound than the bad-faith hyper-“rational” stuff, so you can kind of trap them with their own rhetoric). But ultimately him getting a mediocre grade in Ethics of Computer Science isn’t going to prevent him from getting a job or getting VC funding for some shitty harmful idea.

I think a big part of nurturing a more reflexive or curious mindset in students is simply modeling that reflexivity and curiosity for them, which in my experience mostly means showing up with curiosity about my students’ ideas and interests. Asking them questions about stuff they take for granted can get them asking more questions in general.

Seeta Peña Gangadharan: How does your teaching/teaching philosophy connect or not connect to your On the Rocks project?  

Ingrid Burrington: The rocks-related stuff and my approach to teaching engineering students are both more about power and living with/within complex systems than they are about technology per se. Like, yes my students are going to go on and maybe get jobs at big tech companies where they’ll be asked to work on stuff they’re not OK with and I want to prepare them to navigate that with integrity and clarity of purpose, but I also hope they navigate their lives with integrity and clarity of purpose! Similarly, I think a lot of people get interested in mineral supply chain stuff because it’s attached to consumer hardware like a phone, which makes sense because it is an extraordinarily intimate object that’s a big part of their everyday lives. (This is definitely where I started!) But if all you take away from learning about cobalt mining in DRC is “I feel bad about my phone” you’ve really missed the mark.  

Seeta Peña Gangadharan: Much of what ODB gets into around surveillance, data profiling, and privacy has to do with histories of racialized oppression and, to a certain extent, racial capitalism. I feel like On the Rocks implicitly makes a connection to this by talking about what Miriyam Aouragh and others call extractive infrastructure (see “The extractive infrastructures of contact tracing apps”). Would you agree? (In general, tell us about this work and why it’s important to document and share.)

Ingrid Burrington: It’s encouraging to see increased attention to and understanding of the extractive dimensions of computation. I think for a lot of software-oriented researchers, talking about mineral supply chains feels like something that’s not their problem, or at least not related to the problems they face. But the ideologies of mineral extraction and data extraction are very similar, right? Treating harm as “externalities,” constantly seeking new sources to extract, using promises of “transparency” to placate publics.  

I do worry sometimes about the mineral supply chains underlying computers basically becoming a prop in this larger discourse around technology and power–a topic that gets a hat tip in papers and stuff but is mostly used to confirm assumptions rather than to expand understanding of the ways computation is, and kind of always has been, entangled with extractive global capitalism, going all the way back to fabricating the first integrated circuits. I’m not sure if this will sound too harsh, but I don’t want it to become the tech discourse equivalent of a desultory, hollow land acknowledgment–like “ok we took two seconds to feel bad about the blood in our phones, moving on to the important stuff.” Like I said earlier, it can’t just be about feeling bad about phones, because extractive industry is not a static thing. There are new sacrifice zones being sized up, new scientific research projects that could totally change e-waste sourcing, and new resistance and organizing work happening all over the place in response to tech-adjacent extraction. And people who want to fight surveillance capitalism or organize tech workers can learn from those endeavors!

Seeta Peña Gangadharan: Can you explain what supply chains are and why we should care about them with respect to AI, data-driven and surveillance tech, the cloud?  

Ingrid Burrington: Broadly speaking, a supply chain is basically a process by which something–a computer, a vaccine, a t-shirt–gets made at a large and/or industrial scale. It’s the path from raw ore to refined mineral oxides to hardware components to a laptop in a store to a laptop in a pile of e-waste junk. They’ve kind of gained more prominence or public attention in the last few decades because they’re increasingly far-flung and complicated (like, think of the number of countries involved in making one Uniqlo t-shirt or something–it’s more than you think!) and this year has made it really apparent how brittle they actually are (remember how hard it’s been to get personal protective equipment to healthcare workers or how there just weren’t enough ventilators for hospitals in the spring? That was partly because of supply chain breakdowns!).  

When it comes to cloud-driven software and tools, supply chains can be helpful for understanding where to put political pressure or how to challenge a system. Stop LAPD Spying’s work on PredPol is a good example of this–they mapped out the process by which some really specious UCLA research became a company and a major vendor to police departments, and all of the public and private interests who benefited from the company succeeding despite its really questionable efficacy.

Seeta Peña Gangadharan: How would you like to see the insights from On the Rocks filter into your teaching… and into the practice of others, including people like us at ODB? 

Ingrid Burrington: This is maybe just me continuing to be a contrarian who’s over the broader tech ethics industrial complex, but one thing I find really helpful about the rocks work is that it’s a good reminder that all of this computational stuff–surveillance, AI, data collection–all that stuff people like you and I get super stressed out about, is actually pretty small. It matters, of course, but it’s just one piece of a big messy thing. It’s not actually the only thing. Which is important for keeping egos in check and remembering what I’m actually interested in and want to work toward.