
Timnit Gebru: Oral Histories of Surveillance

This field is pretty terrible, you know [laughs]. It’s this combination of people thinking they’re so smart and objective, and that combination just really gets to me. Regressive is the right word.
—Timnit Gebru

Our Data Bodies (ODB) is excited to present a series of oral histories from organizers, writers, and cultural workers whose work has been steeped in resisting discriminatory surveillance technology and racialized surveillance capitalism, and has illuminated strategies for abolition and abolitionist reform. This oral history features Timnit Gebru (she/her), researcher and computer scientist based in the US.

 

Kim M Reynolds: Hey there Timnit. So you’ve been in the field of computer science and computer vision for a really long time. Can we start with you reflecting on your field and your work?

 

Timnit Gebru: This field is pretty terrible, you know [laughs]. It’s this combination of people thinking they’re so smart and objective, and that combination just really gets to me. Regressive is the right word. You know, the funny thing is that people think it’s the opposite, right? But yeah, I was just thinking, “Oh god, what do I do in this field? Do I stay?” Especially since I was in computer vision, I was just like, “I’m out. I’m done.” I actually sent an email to Joy saying “I am done! I am done with this field!” and I thought she’d be like, “Oh yeah. Leave, girl.” But she was more like, “Well, you know, look, change doesn’t happen overnight. I understand you’re tired, and you need a break.” And that makes sense, but she wasn’t like, “Exit, girl. Leave.”

 

Within the AI field, there are different subfields, right? So when I look at the different subfields, I look at people in natural language processing, and the kinds of conversations I have with people in that field, about culture or their proximity to other fields like linguistics, are so different from the kinds of conversations I have with most people in computer vision. Devin, who is also on the board of Black in AI, wrote this 12-page paper about combating anti-Blackness in the AI community, and this is the kind of thing you would expect the academic community to speak about. A lot of people talk about the corruption of research by industry and stuff like that, so you would expect the research community, the academic community, to take some sort of stance, to say something, right? No. I mean, how is it that you have companies coming up with stances [on tech, AI, facial recognition technology] and the academic community says nothing? I find that to be ridiculous. You would expect it to be the other way around, with the academic community pressuring industry, but you know, you have these conferences where people are not saying anything. It’s just so strange. And that tells me how backwards this research community is. And then there’s the way they describe me: not as a researcher, apparently. A lot of people describe me as an activist, which is fine, but you know, my first and foremost description should be that I’m a researcher, right? So just because I’m talking about this stuff, about how people are creating things that are deployed in the real world and affecting real communities, somehow mentioning that automatically makes you an activist. So it’s that paradigm. And for me, that whole paradigm permeates the academic community, industry, everybody.

 

I don’t remember who asked me, but someone asked about academia versus industry or something like that. And honestly, I think this dichotomy, at least in my field, is false, because the same exact people go into industry, their students go between academia and industry, and they collaborate together. Then there’s funding: who’s gonna get that $20 million NSF [National Science Foundation] grant or whatever? It’s the same people who make that decision, right? So for me, it’s the same paradigm guiding everything.

 

So anyway, how is it that, you know, this community has not taken a single stance on surveillance, when we’re in the midst of this moment [uprisings in the US]? Journalists are talking about the use of face surveillance everywhere, they’re talking about Uighurs in China, they’re talking about Black people here, and the academic community is quiet. And then when you say something about it: “Oh, you’re an activist.” I mean, the bar is so low.

 

But, yeah, it’s kind of a struggle, like, you wonder… I feel like I’ve had some success with some people in terms of educating them or changing their views on things in their paradigm. But it takes so much effort to just change one person, and it happens at the cost of, you know, just so much exhaustion. One of my friends, who is Black and Brazilian, was saying, “You know, we should just concentrate on our community having more power,” because when people have more power, they have to listen. Even now, like even this whole thing with Jan, people only listened because I had some amount of power, some amount of visibility, some amount of community that was backing me, right? If it were the opposite, nobody would have paid attention. So he was like, you know, we should just focus on giving our community more power, because that’s the only way things change. I was like, “You’re right.” That’s what people have been saying for a long time.

 

Kim M Reynolds: There’s a question I’m actually interested in asking you about negotiating. You’ve worked for these companies, right, like Apple, Microsoft, Alphabet, and these companies are major drivers of racial capitalism. How do you negotiate working in these spaces, knowing that they’re such hostile environments for Black people, particularly, I would say, for Black women and Black queer and gender non-conforming people? And how do you build community within that?

 

Timnit Gebru: It’s actually hard. There are multiple negotiations that need to happen. First is, how do I maintain my voice and not get subsumed by one of these big companies? Because every time something like this happens, the PR people want to be involved, they want to know what I say, what I don’t say. What’s in my contract, what’s allowed, what’s not allowed? What kind of risk do I take, and which risks are worth taking? When do I take a risk that I think is important, versus when do I not take a risk? A lot of times, for me, that type of negotiation has to happen. It’s very important for me to have my voice externally because, actually, that’s also, I would say, an insurance mechanism. Visibility is very important, and unfortunately… if I just got off of Twitter, I wouldn’t have any of this kind of drama on Twitter, right? But then at the same time, I wouldn’t have visibility, I wouldn’t have my own voice, I wouldn’t be able to talk to some of the people that I do on Twitter. I think that’s actually very important for Black women, because when something happens, you want other people to be watching, and you want to control the narrative of what’s going on. So many times, you’re encouraged to do the complete opposite. Like me: so many things have happened, and I’m usually very loud about it, like, very loud, and they kind of don’t want me to be, right? They don’t want us to be loud in public.

 

Microsoft Research was just a one-year postdoc thing, and I was doing my thing in a somewhat isolated New York office. I didn’t know that much about what was happening in the rest of the company. For example, I would be going around giving talks about predictive policing and how bad it is, and at some point I read that around 2012 Microsoft had some contract with the NYPD. And I’m like, what’s the contract? What are they doing? Or I would go around with my stance on facial surveillance, but they were still doing it at that time, right? They weren’t really limiting the use of facial surveillance. So I was doing my independent research and they didn’t police me, but I think sometimes that is used as a cover for the company to just do whatever it wants. So, okay, how do you maintain your voice and not feel like you’re being used as a cover? But then also, can you make any changes inside? I’ve had a lot of people tell me, “Look, these are huge institutions; maybe what you can do is divert some of the resources to build our own institutions. You can’t change these institutions.” A lot of my friends like to say that, and I go back and forth: you know what, you’re right, this is just too exhausting, I can’t spend my energy doing this.

 

But then at the same time, I feel like I’ve created a little bit of a safe place for a small group of people who I think have a little bit more freedom to, let’s say, say something against certain practices in, let’s say, the computer vision industry. Things that are very terrible for queer people, for example; we’re talking about computer vision, surveillance, stuff like that. Creating a safer nugget, a safe kind of corner, let’s say, of people who can do that. At this moment, I think that’s valuable, you know, for people who were maybe at a different place where they were harassed, gaslit, and so on, and who care about certain things that I do as well. And so I feel that that is useful, because it can retain the people who push to organize.

 

And so that’s my current thinking, right? But you know, my thinking evolves; maybe a year later, two years later, I may feel like I can’t stay in these spaces any longer. But that’s sort of my thinking right now: if I can have a little bit of a nugget, a kind of a corner, for the groups of people that I’d like to see empowered in the tech industry, and the groups of people who push back, I’d like to see that. And then if I can divert some resources from the industry to other communities, I’d like to be able to do that. Even in the midst of these huge multinational companies, if I could somehow work to make sure that they’re less harmful… Let’s say, for example, within surveillance, if I could somehow partner with various groups. There are inside-outside partnerships, right? When you’re inside, sometimes when people on the outside shine a light on something, it helps the people on the inside trying to work towards a specific goal. Of course, so many times people are resistant if it’s something that costs them money; obviously, companies that care about stock prices are not going to listen, right? But bad PR on the outside, journalists paying attention to stuff and civil rights organizations paying attention to stuff, helps steer things in a better direction.

 

So that’s sort of my thinking: where could I make a little dent, a small little effect? Sometimes I think about, you know, being an academic. And I’m just like, look at these academic institutions; I don’t know if I’d last at these academic institutions. Right now, honestly, the only reason I’m still at Google is because my manager is very supportive, and he protects me. He uses his power to protect me, but it took us a long time to get to this place; there was a lot of friction. There was a lot of fighting and stuff like that. And so for him to get to a place of understanding took a long time, a lot of energy, and a lot of real problems. And then also because of my team: I have a very supportive team, and it mostly consists of the people I’d like to see empowered in this community. And so I’m thinking, “Okay, if I were to go into an academic institution or something like that, I would have none of that, and it would be even more lonely.” And right now, I don’t see any of these academic researchers, or anybody, really taking a stance against any sort of surveillance of marginalized communities, a lot of which is being powered, of course, by research from the computer vision field. So, I don’t know, that’s my thinking right now. If I leave the companies, I think I would not go into another institution. Maybe I would do my own thing. It’s just difficult for me to think about jumping to another institution at the moment.

 

Kim M Reynolds: How do you achieve clarity in your politics between abolition and reform?

 

Timnit Gebru: You know, it’s very interesting, I guess, having your own clarity. I think that’s why I always say, that’s my current thinking, and I will see how it goes later. And it’s not like the Black community has a united ideology, you know; everybody has a different view, right?

 

Some people think, okay, we have to build wealth, we have to increase the number of Black founders. You know, “97% of the founders in Silicon Valley are white and Asian people, etc., etc.” And then other people are like, well, but if those Black founders that we add are still advancing this system that oppresses the rest of the Black community, then isn’t that wrong? You know, so maybe these are all things that we should probably work on simultaneously. What about Black people in the military, Black people trying to, let’s say, change things in the military? What is my position on that, right? And I mean, the military…

 

And then I think about Eritrea. I’m of Eritrean descent, and I was born and raised in Ethiopia, and I left. And when I go there, if I talk about algorithmic colonization, sometimes it can feel preachy. It feels like a colonial mindset, in some ways, actually, because, okay, here I am, I’m working at a big company. I am in the US. And now I’m back, and I’m talking to people about brain drain. That doesn’t sound right, right? What I want to do is say what my thinking is at the moment and… you know, people should have the same opportunities. I would say, in the Black diaspora, we are not all on the same page. During my mom’s generation, she would tell me, they used to protest every weekend at the university she went to in Addis. They used to protest about Rhodesia, or apartheid in South Africa.

 

I can’t imagine people in Ethiopia right now protesting about something that’s not in Ethiopia, like in solidarity with some other African country. I feel like back in the 60s, there was a lot more solidarity among the African diaspora, and more understanding of how white supremacy affects everybody. You know, right now, I think about all the Eritreans who die going out on boats. For me, that’s very much tied to white supremacy. Of course, right now, it’s the conditions, it’s the government that’s there right now. But if you look at the history, all of it has to do with colonization and white supremacy. Why do they have to die trying to get to Europe, crossing the Mediterranean? Why does that income inequality exist? Why is it so hard for Africans to get visas anywhere? Why was there a war between Ethiopia and Eritrea to start with? It’s because the Italians colonized us.

 

But I don’t think people think about that. I don’t think we think about the roots of some of these things as being white supremacy. And so one of the things I think is important is for us to have a more accurate understanding of our history, and to understand the roots of white supremacy. I was reading Trevor Noah’s book, and he was talking about that in South Africa too, like how divide and conquer was so much easier. And also the same thing with people wanting to be colored because it’s closer to power, right? All over East Africa I see this. Take just Ethiopia as a place: there is a huge range of hues, right? There are really light-skinned people and really dark-skinned people.

 

But a lot of East Africans that I know here, who immigrated to the States, don’t want to be considered Black. They don’t. They want to choose “other.” It’s this wanting to align yourself with what you think is power, right? With what you think is… is better. It’s similar to what Trevor Noah was talking about with colored identity, you know; people in South Africa who were classified as colored were like, “Well, at least I’m not Black.” So it’s this mentality, because people have learned such a skewed history. And it’s so strange. I mean, if you look at my grandfather’s generation, they were still colonized by the Italians. And then my dad’s generation still had ties with Italy, like they spoke Italian. My dad went to school in Italy, and they were taught by Italians and stuff like that. And it’s this weird thing where they think Italians are racist, and they talk about stuff like that, but then one of my uncles one time talked about how the Italians, quote unquote, “civilized” us, right? I’m like, are you kidding me?

 

But I know during my great grandfather’s time, people resisted. People resisted colonization. They did not feel like they were being civilized, right? They had to sit at the back of the bus, they couldn’t go past fifth grade, and stuff like that. So it’s like we all learn this white supremacist version of history. I mean, we learned that Christopher Columbus discovered America. So, again, these people will come here and not want to say they’re Black or whatever. But then I’m like, do you understand that the huge issues you’re facing, all of them, migration, people dying on boats, that is because of white supremacy? And you’re aligning yourself with, quote unquote, “not Black” or whatever, because you’ve been taught to align yourself with your own oppressor, right?

 

There’s a lot of work to be done for a more accurate representation of history to be learned by many of us in the African diaspora. And that’s actually one of the things we’re trying to do in Black in AI. A lot of people have written works on decolonial AI and stuff like that, to create a decolonizing stream or field. Black in AI is a very global organization, right; we have people from all over the African diaspora. At least we can start with our own community, about how white supremacy is affecting all of us.

 

Kim M Reynolds: What is the possibility of decolonizing AI? What does the combination of even those words together mean?

 

Timnit Gebru: I think there’s a difference between decolonizing institutions and decolonizing fields, right? So when people say decolonize science or something: sure, you know, science is not a Western thing, and people want to talk about it as if it’s a Western invention, but it’s not, right? It was created throughout the years with lots and lots of cultures. Or math: people talk a lot about algorithmic bias, and the term algorithm is an Arabic word. So I think first about a paper called Decolonial AI, where they wrote about this: what does it mean to decolonize AI? They were trying to see how decolonial studies would apply to this and how to look at it through this lens. And so what does it mean? [In the paper] they have various kinds of specific proposals and things like that.

 

But for me, it honestly starts out really, really basic, which is questioning assumptions, understanding our history, a shared history. The other day we were having a Black in AI social at this machine learning conference, and one of the things we were doing was just talking about decolonizing AI: what does it mean? So we invited some of the people who’ve written these works. One person, who wasn’t there, Sabelo, wrote about ubuntu, and how relational ethics is different from this, you know, Descartes “I think, therefore I am” kind of thinking, which is currently used in AI. So looking at, or tracing, that line of thinking, and how it would look different if ubuntu were the source of it instead. I always talk about how the French are raised to think that they gave the world everything. You know, the best thing since sliced bread. The best art. The best science. Best whatever. And they’re not raised to be upset about how they have caused so much harm to the world through colonialism and colonization, how they continue to colonize the world. That’s not what they’re thinking, right? Like what they’ve done to Haiti. They just grow up with this source of pride about being French, right? That’s how they grow up. That’s not how we grew up; we grew up wondering, what’s wrong with Africa?

 

And then people’s vision of what the future should be, or what development should be, is always a Western vision of, you know, high-rise buildings or whatever. So the question for me is: how do we start thinking about our history in a more accurate way, about what we gave to science, and about how what we have to offer is so much better sometimes? Even when people think about justice, a lot of times some people think, oh, at least in the US they have justice, they have a working justice system or something like that. And I think, well, a justice system that just criminalizes people, and then they can’t even… it’s like, once you’re in, you just can’t get out from under the fact that you were in jail and what that means to all these systems.

 

So when we were doing this social, one of the things we did, while talking about what it means to be colonized, was share writings or poetry in different languages, whatever people wanted to share: music and art and stuff. Because for us, I think, part of what it meant to be colonized was that this is not what you’re supposed to do in an academic space that’s discussing AI; that’s not normal at that conference, right? It’s not considered a way to have scientific discourse. That’s not one of the ways in which you’re supposed to talk about science. Whereas in the communities we grew up in, you know, art and music and poetry and stuff are so integral to any sort of gathering. And when people remove them, it’s usually because they have been taught that the way in which you have discourse, the types of works that you value, are not that. So I think even just little things like that can be a step.

 

So, yeah, what we did was we had a social where we invited people who wrote some of these works about decolonial AI. For example, Devin had a paper, Combating Anti-Blackness in the AI Community. And then these three people had written Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence. So we were all like, well, let’s talk about what this would be. What is it? What does it mean to decolonize AI? Is it even possible? How do you do it? So we just had a two- or three-hour kind of social where we were discussing this kind of stuff, but also people would just introduce themselves and share their favorite piece of music or painting or poetry or something.

 

So I thought it mattered, just having that space, where this is how you discuss science, and we’re all scientists, and this is really not a thing you’re supposed to do in these academic scientific spaces. That’s sort of how we started. And what we are hoping to do is collate some works and create a series or something for us, so that people in Black in AI can be educated about our own history and how it relates to science and how it relates to technology.

 

I don’t think I have an answer for what it means [to decolonize AI], but it’s like, how do we start thinking about it? Well, first, people in the African diaspora, we should find each other. And we should think about what it means to do this, and we should understand our history to start with, and how that relates to this, and how we can imagine a future that’s not just Westernized. When I talk, or when I teach, let’s say in Kigali or other places on the continent, what a lot of people want is resources. A lot of people want to do science, they want to work at big tech companies, and they don’t have resources for these things, because, you know, a lot of the resources are eaten up by other places, right? And so I can’t go there and be like, “Oh, don’t think about that.” I can’t do that. So it’s a combination: how do you try to redistribute resources, but then also, those of us who are at these really powerful institutions and elsewhere, how do we think about our own histories, our own contributions to what’s happening to our own people? And how do we think about how we shouldn’t always align ourselves with power? We found that in Black in AI, there are a lot of people who are specifically thinking about this. Why not, you know, have resources for our own community first, before we go out and talk to anybody else? Educate ourselves about our own history and how it relates to technology.

 

Kim M Reynolds: What brought you to the work of computer vision? And how does that intersect with your own questions about racial, social, and economic justice?

 

Timnit Gebru: I took a very roundabout way to computer vision. I was doing electrical engineering, analog circuit design. My father was an electrical engineer, and my sisters are electrical engineers. So I didn’t really think about, you know, what to do. I was just like, I like math, I like physics. I also played piano for a long time, and so I took a class where I was able to work on audio circuitry for music and stuff like that. I worked at Apple in the audio group, because there were a lot of people there also interested in music. Then I came back to grad school, and I was meandering around, doing more circuit-related stuff. And then I worked with Professor Audrey, who was working on something called optical coherence tomography, which is imaging for the eye, like medical stuff. So I thought it would be cool to work on low-cost imaging applications, like medical applications; let’s say if you, you know, wanted to have some sort of device that can be used in rural areas or something like that. And so I started working on that and started getting more interested in the imaging side of things. I took a class and then got more interested in computer vision related stuff. And then I left, because I thought, okay, this isn’t for me; I was too isolated and stuff. That was in 2011.

 

And then I did a startup, just random, just because, you know, I kind of got brainwashed, right? When you’re at Stanford, that’s kind of how you think: startups are cool, you’ve got to go start one [laughs]. So I did that. In 2012 I was doing the startup, and I was still auditing classes on machine learning and stuff like that. Then I came back to school, because I was thinking about what I wanted to do, what I was going to do with my life, blah blah blah. So I came back to do my PhD in computer vision. And at that time, I was not thinking about any of this. I’ve always done other stuff in terms of, you know, diversity and inclusion or whatever, like I’ve spoken up about things. But I never thought about the relationship between computer vision and marginalized communities, not at all, when I was doing my PhD, not once. I just thought, this is a technical thing. It’s engineering, you know.

 

And then what happened was, you know, when I went to grad school, I saw, oh my god, there are basically no Black people here, none. And then I go to these conferences, and it’s thousands and thousands of people, like seven, eight thousand people, and you see one or two Black people. And it’s just like, oh my god, people talk about the lack of diversity in the tech world, and this is even way worse; it’s like nobody. And then I was thinking about drones, and I’m just like, oh god, you know, this technology, who are they gonna classify as a terrorist? This is happening.

 

And then, at some point, I got introduced by my friend Jess, a Rhodes Scholar, she’s South African, and she said there was some email from Joy Buolamwini about looking to partner with someone who was doing computer vision or something like that. So she introduced me to Joy. And I was looking at her talks and stuff, and there was a piece about face detection and how certain software was not detecting her face. And I was like, wow, I didn’t know this. At that point, AI started getting really, really hyped. When I started working on my PhD in 2013, it wasn’t like that at all. So there was the combination of the hype, and the fact that there weren’t Black people, and then, when I tried to talk to people about my experiences with police and stuff, they weren’t understanding it at all. And then there was the ProPublica article that came out in 2016, about crime recidivism. I didn’t know these tools were being used. That’s when I started, I think around 2015, 2016, to see that these two things are tied. I really didn’t think of technology as a tool for domination before that. I don’t know how to explain it, but it’s the way I was taught about science. When I thought about science and injustice, it was that it was all white guys; there was sexism and racism in the field, and that’s the stuff I was thinking about. I wasn’t thinking about how it was being deployed at scale, in ways that could be harmful to people. That, I thought about later in my PhD.

 

Kim M Reynolds: Can you just give us a brief definition of exactly what computer vision is? 

 

Timnit Gebru: Computer vision is, I would say, a subset of AI which deals with imagery, helping machines understand how to interpret images. When you look at an image, let’s say of a backyard or a tree, computers can’t necessarily interpret what’s happening. That’s what computer vision tries to do.

  

Kim M Reynolds: How does your work carry this vision of abolitionist democracy, if it does, or any kind of form of that? And then, how do you see white supremacy manifest itself within the field of computer science and computer vision?

 

Timnit Gebru: There are so many ways in which white supremacy is manifested in the field of computer science. Where do I start? For example, look at the people involved in this field, and what kind of history they believe. Well, they believe that Europe is, you know, a great Western civilization. If I talk about France or its colonial histories and how European countries are still colonizing Africa, that’s not a thing you talk about, you know. So it starts from the twisted history, the gaslighting. The paradigm they operate in is based on that, right? And so, let’s say you’re talking about face recognition or something like that, and someone says, “Oh, well, what if a bad actor has it?” Well, who’s the bad actor? So there are those little things, but also the academic paradigm itself. The biggest way in which I see this, and this is not just computer science, this is, in my opinion, all of science and how it’s taught, is that there’s politics, and there’s science, and they’re set up as completely separate. I remember giving a talk one time, and this professor told me I was weaving in and out of science and activism. He was like, part of it is you’re talking about how you think the world should be. And I’m like, well, you know, scientists are always doing that.

 

Then there is something like attractiveness prediction; that’s a task that people work on in computer vision. Who’s considered attractive? Should this even exist, attractiveness prediction? I don’t even know how to explain it: white supremacy permeates everything, from who it is that’s working in this field (there are literally very few Black people) to whose history is considered just history, right? All of these practices, whether it’s the datasets that are used or just the act of classifying. Someone was telling me about this book called Sorting Things Out, and there’s an underlying assumption about whose positionality is centered. And also just the assumption that things are objective, that there’s no subject and object, that it’s just mathematical: you can just abstract things out from the real world and focus on your math, and then other people, you know, can deal with the real world. For example, my friend Ramon Vilarino, a Black Brazilian scholar, wrote this blog post asking, “Is science racialized?” The answer to that question is obviously yes, but they [the majority of the field] assume it’s not, right? You can’t use these kinds of words in my field and be considered a scientist; you’ll be relegated to a term that they use, which is activist.

 

There’s just so much I can say about this. But yeah, in terms of my work, I haven’t really thought very much about the abolitionist strand in terms of my own work. I think for me, a lot of stuff right now has just been happening unexpectedly, without planning on the end of things, right? So for example, starting to work on face-related stuff really happened by accident, as soon as I met Joy and I wanted to advise her on her thesis. And I could not have predicted where that was going to take her. And now, you know, people are passing laws, we’re talking about police. I never imagined that that’s where this work would go. And I started just emailing every Black person I saw in this field and said “hi” to; I kind of created a mailing list from there. So I guess that’s what I mean: I’m just, you know, chugging along, and then my work goes in this direction, you know? So I don’t really know if there’s an overall framework that I’m following, but I’m learning, and my thinking is always changing.

 

Kim M Reynolds: What is something that’s bringing you, you know, hope and vision?

 

Timnit Gebru: So, I love dancing, but that’s one of the things that’s sad right now: I think social dancing is gonna be impossible for some time. I used to go dancing all the time. Salsa. Bachata. All of that. I’m sad because I think it’s the last thing I see coming back because of the pandemic, but it’s literally, I think, the thing that gives me the most joy, and it’s my escape. I love dancing so much. But right now what’s really cool is that a lot of DJs are having this thing called CoBeat Parties, where DJs from around the world are playing music on Facebook Live, so that’s great. I tune in and listen to the music, and remember that I love dancing.

 

You know, what gives me hope is, like, when I was at that Black in AI social: it was three hours with 50 people or so, and we were all feeling kind of similar things, and it was like a safe space. I just had that space for the first time after, you know, all the stuff that’s been happening [uprisings, pandemic, etc.], and that felt really good. And the people I’m working with give me hope as well; there’s a student I’m working with on a computer vision project that is mapping spatial apartheid, and she grew up in a township, so just seeing the next generation. But at the same time, you know, when I was reading about the 60s, it tells me that the exact same thing was happening then. Where I live, in the Bay Area, at that time they were talking about, you know, fighting the police. And I think about that, and it makes me so sad, because that was 50 years ago, and they made so much progress back then. And now, 50 years later, we’re talking about stuff like this as if it didn’t happen… I don’t know, but yeah, I think those are the things that give me hope.
