
Kenia Alonzo & Edgar Cruz of Generation Justice: Oral Histories of Surveillance

We started teasing out all the levels of the way we actually criminalize young people by having surveillance. We started thinking about who’s in charge of this and who’s really pushing this through, and it was both acutely… Well, it was acutely APS, but also broadly that there was nobody, no one person making sure that this was going well. We were just super afraid that there was no oversight of this.
—Edgar Cruz

 

Online people constantly make memes of, “Oh, here’s my FBI agent watching me on my camera.” And I’m like, “That’s not normal!” Yeah, it can be funny sometimes but having to constantly be aware and think that, “Oh, hey. Someone’s watching me right now,” isn’t normal. The thing that I hear the most is, “I don’t really care because I have nothing to hide,” and it’s not really a matter of if you have something to hide. It’s more like, “Are you really okay with constantly being watched even if you aren’t doing anything wrong?” Like, it’s just not right.
—Kenia Alonzo

 

Our Data Bodies (ODB) is excited to present a series of oral histories from organizers, writers, and cultural workers whose work has been steeped in resisting discriminatory surveillance technology and racialized surveillance capitalism, and has illuminated strategies for abolition and abolitionist reform. This oral history features Kenia Alonzo and Edgar Cruz, who reflect on their organizing work with Generation Justice in New Mexico. The transcript, which has been slightly edited for clarity, follows below.

 


 

Lexi Spencer-Notabartolo: Let’s get a sense of where you’re coming from. How would you describe the work that you’ve done?

 

Edgar Cruz: Kenia, do you want to go in first and have me jump in?

 

Kenia Alonzo: Yeah, sure. I joined Generation Justice when I was about 13 or 14 because someone sent me a link to Generation Justice being like, “Hey, we’re accepting young people who are interested in activism who are here in Albuquerque,” and of course I wanted to do that. So, I joined when I was quite young. Then, when I was 15—like, freshly turned 15—I had a chance to go to the Allied Media Conference with [Generation Justice Director] Roberta [Rael] and three other people from GJ. I would say that was my introduction into digital security, mainly because we had the chance to go to workshops and meet these people who were literally national figures teaching digital security. I was this little teenager, going into a room like, “Oh, hey, what’s digital security?” and then learning all that scary stuff. But instead of getting a feeling of like, “Oh my God, I’m being watched all the time and surveillance is so scary,” I was instead given the opportunity and resources to learn how to protect myself and my friends and my family and Generation Justice and our community.

 

So, we met Matt Mitchell and… we had a great opportunity to talk to him. Eventually—a couple months or a year later?—we were able to bring him here to Albuquerque and he was able to give a two-day workshop to us at Generation Justice and we learned about, like, 2FA [two-factor authentication] and pretty much just any precautions that we can take to protect our emails, our personal social media accounts, any other social media accounts and—It was just a really cool thing because we brought in other kids my age, kids who were younger, their parents. So, we all had an opportunity to really learn and kind of get thrown into the world of surveillance and digital security, but not without protection first. That was really important and probably one of the coolest things I think we’ve ever done.

 

Either a little bit before that or a little bit after that, we also started this little—not little—this national newsletter called ‘e.Woke’ that I was just kind of helping out with at first, writing, like, maybe one story. It was a very humorous type of newsletter, so we could talk about, “Hey, we just found out that they’re trying to make us pay for the Internet,” or “Your Roomba is mapping the outline of your house so they know what ads to send you,” but we were able to have fun with it, instead of being like, “Hey, guys, make sure you get rid of your Roombas and be scared.” That was probably one of my favorite things that I’ve gotten to do and after a while I eventually took it over. So, on top of the work that I was already doing with GJ, I had the opportunity to be in charge of that national newsletter.

 

I was working with GJ until I was 19… or, until 2019 when I was 18. Towards the end of it, e.Woke was pretty much my baby. I enjoyed it. I had a lot of fun doing it. That is probably one of the things I was really consistent with, but throughout my time in GJ we would always be learning about digital security and trying to either have a workshop on it or do a radio production about it. I’m sure Edgar remembers more that I probably forgot, but that’s what came to mind.

 

Edgar Cruz: Thanks, Kenia. Kenia got totally most of it. I think I’ll just slip in my own personal stories and experience through it. So yeah, it was totally that AMC that totally sparked it for the rest of GJ also. So, Kenia came back and I was a… I think I was a media justice intern with GJ at the time, working closely with the other interns and some of the youth and Kenia came back with just this energized insight on what was going on around technology and AI and that super resonated with me. I was doing a lot of that research on my own and just kind of trying to educate myself on what it meant for me and my family and my own community.

 

So, those two merged super well because they caught me up to speed and then we all just went full blast from there. I think some of the earlier thoughts were on building some type of structure around moving forward with all this information, and it was just, like, “This is so overwhelming.” We spent a lot of time reflecting on why it was overwhelming for us, what this meant for us, [and] what we felt our responsibility was now with this new information. What came out of that and what felt super sure was, “Okay, if we had access to all these resources and we had access to these trainings by super skilled trainers in this structured way, how can we recreate that experience for our community? If we feel totally overwhelmed by this information, how do we make sure that we share how important this information is and share—thoroughly—resources and what people can really do with it?” How do we make sure that our community members don’t get stuck where we got stuck for a little bit of time between being just freaked out about how much… Because I think we were unspooling how much technology we relied on and what that really looked like for us. So we were like, “Okay, how do we move people quickly from that stage of fear and like sclerotic inaction to really know that they’re empowered and that they have agency?” even if that agency is just to say, “No, we won’t use the service,” and to think critically around this.

 

Soon after we were like, “Okay, we totally need to make this information palatable and put it out in a way that empowers people to know that they have a lot of their own agency in this realm.” So we started putting together mini research projects where we would break down, “Well, what does this mean for New Mexico? What does this mean for this region of the United States? And then what does that mean for the people who live in this region?”

 

We started learning that Hispanic youth and Native American youth are amongst the most policed in our state and so we were like, “Okay, then. Let’s break it down by county and let’s break it down by school district,” so we started looking into some of the legislation around, “Okay, we know some schools are policing differently than others.”

 

… I went to high school in Farmington and the high school that we were at was fenced in and we had all sorts of weird designs around surveillance and we were like, “Okay, we have a sense that there’s little legislation around regulating that from county to county, district to district, so what does it look like for APS?” (APS is Albuquerque Public Schools, the largest district in the state.) Once we started looking into it, we learned that there was a levy bond that people were voting on that was a part of a preventative measure that I think was named “preventing violent extremism” or “countering violent extremism.” I think those were the two aspects. Like, in schools and then in the community, something like that.

 

It was over $500 million to implement surveillance tactics in schools and something we’ve learned—as New Mexico is a super multicultural state, but a super vulnerable state that has experienced dimensions of colonization, time after time—is that we are a fertile state for that type of testing. Also, because of when we became a state and the lack of regulation around a lot of things, we were like, “Okay, so we’re immediately concerned for our most vulnerable communities.”

 

That’s generally the scope of GJ around radio and around all of our projects so we were immediately linking that together and, sure enough, found that APS was implementing this surveillance, but was also being monitored by places all over the country. So we were, like, “Okay, people are super interested in how New Mexico is dealing with this. You know, what will happen after this research project is over to all these youth?” Then we started pulling together stories of people who went to a school where they were having, like, microphones in the hallways and people were being outed to their family because of an incident that had to do with a same-sex partner in school or something. We started teasing out all the levels of the way we actually criminalize young people by having surveillance. We started thinking about who’s in charge of this and who’s really pushing this through. And it was both acutely… Well, it was acutely APS, but also broadly that there was nobody—no one person making sure that this was going well. We were just super afraid that there was no oversight of this. So we knew that there was little oversight of this and we were immediately freaking out about, “If we don’t have any control over this and we also like, can’t go to one person—much less the school board—to talk about this, the next step and what we can do immediately is start informing people of what they can do and how to protect themselves.”

 

We started with our staff and started implementing the trainings that Kenia mentioned where we brought in people like Matt Mitchell, who trained us to be micro-trainers to more staff who weren’t able to make it. Then, we worked with their families. Then, we started going toward our more vulnerable communities that we worked with and started doing trainings in the community. A lot of those look like, “What do you think about your password? Why do you think you made the one that you did? What would you do if somebody got a hold of it?” and really planting seeds for people to think critically of what this means for them. Because something we learned was that there’s no blanket understanding of what this might mean for everybody and so we really allowed the tools to go into community in the way that they saw fit. We didn’t go into the community thinking we need to tell them exactly what to do and this is how they should do it. We were like, “If we really want to build a sustainable understanding around surveillance and digital security, we need to give them the tools to understand how they see it in their own lives.” So some of those trainings were the password stuff. A lot of them were locking down different social media accounts, really understanding your digital footprint, and then further on at a more skilled level was how to use PGP [an encryption system] and how to use encryption and two-factor authentication and why it mattered.

 

We started reshaping as we went. We realized that there was this huge dissonance of people feeling like, “Oh, this doesn’t include me,” or “This doesn’t matter to me.” That left us back at square one—often—where we were like, “People don’t think this matters because they’re not feeling something instantly as a violent or maybe even aggressive attack.” So we were like, “We don’t want to scare them.” That still was always the priority. We want to inform and not alarm anybody. So a lot of it was like, “Sure, I know that you might feel like this doesn’t matter to you,” but having the research that we had about that bond in APS, we started incorporating that and letting people know that, “You as an individual might not be targeted or doing anything bad, but your information and your data is still super valuable.” And then when you think about it as the state, it’s super valuable. It was harder to track [or] to measure how people were feeling from there on.

 

But I know that was just like something we kept going back to the drawing board on… Like, you know, why do people feel this way? And what is some of the stuff out there that makes people feel like they just don’t matter? And a lot of it was, like, how the Chrome browser gives you just an option to “allow” or “don’t allow.” And if you allow you give them permission to all sorts of things, but your only option was to allow or not allow. You were unable to click off certain permissions. So much of that was like, “Why do we think that is? Why would we have such limited options?” That definitely sparked some interesting conversations about, like, maybe sometimes design is meant to confuse us and also meant to make us give permission to everything or else you don’t use the service at all.

 

We really gathered just such a deep understanding of our youth by approaching it from that way and thinking about how AI design is structured, who it’s meant to serve, and who’s left out of the conversation in its production. We had radio productions that came from it—Like, Safiya Umoja Noble did a presentation with UNM and so we had that recorded and then we got to ask a few questions about the book [Algorithms of Oppression: How Search Engines Reinforce Racism] specifically. We just like watched it unravel in all sorts of different ways. Then we saw at a larger scale how people were writing about it more and people were having more conversations about it. So, out of just more of that, we were like, “We need to think of more tools and how to get this in the hands of people.”

 

Kenia totally fiercely led this e.Woke newsletter that was nationally published and read by thousands of people who wanted to understand surveillance without getting in the weeds of it. We took our research from the community level to think like, “Other people must be thinking in this way, also.” So, the thought behind that was like, how do we make this information resourceful? How do we move people to action? And how do we just keep informing them to kind of keep their ear to the ground on where things are going? Kenia really did an excellent job at making it digestible for young people by including like, “Are you using Snapchat? How are you using Instagram? Do you know your footprint on that? Does that really disappear?” and really tied in that aspect of like, “I know you think it’s like not a big deal to use this but do you ever think about what it would mean if this was being used in a bad way?” and incorporated a ton of humor and blips of just… I mean, it just was so funny and so fun to watch and to read through.

 

People were super receptive to it. I never really heard anything about people being like, “This is scary. You shouldn’t say this to people.” Because anytime we sensed that something we wrote was alarming, we knew that other people might feel this way, and so how do we immediately turn over and offer a tool or resource for that? It laid out articles, but Kenia would write a synopsis of the article through her lens—the lens that GJ had kind of trained us through—and she would break down what the article’s purpose was, why it was a reliable source, and the crux of what it meant. Then, at the very bottom, it would offer 7 to 10 different tools, from [encrypted messaging app] Signal to like f.lux to desktop apps that kind of just help you with technology. So, f.lux dims your screen at a certain time and we were like, “You also gotta take care of yourself.” There was this level of self-care in it, also, that just felt so multi-dimensional and very thoughtful.

 

Some of the APS research was that we worked with a group of law students from the university’s School of Law who were doing a law clinic and chose to be assigned to us. We did research around what legislation exists for this and is there anything on the books that would prohibit us from being surveilled in this way? And of course, we learned that we could never have imagined this type of technology so there was, like, nothing in place. Therefore, it was super fertile ground for anything to be possible. We gained a lot of understanding on how to do this research and it was really good direction on what’s possible and what’s not. We also taught them a ton about, like, “This is how you interact with the community in this way.” It was a super mutual learning experience.

 

Lexi Spencer-Notabartolo: I want to hear a little bit about the context, if you’re comfortable sharing. What is the experience typically that someone in your community would have with surveillance?

 

Kenia Alonzo: Like Edgar mentioned earlier, New Mexico is a very Black/Brown/Indigenous state so I think it’s kind of rare to meet someone who’s like, “Oh, I haven’t been surveilled.” Even just by living in a certain part of town, you’ll have cops patrolling a certain area, cameras everywhere and they’re like, “Oh, it’s just a precaution,” when really they’re concentrating on one space. I recently saw something about—We have a helicopter from the police department here and someone posted an “over time” picture of where the helicopter goes and it immediately went over poor, Brown communities the most and the suburbs maybe once or twice. But I think a lot of young Brown people were experiencing it directly at school. I mean, schools have cameras, schools constantly check in and are like, “Hey, what’s going on?” and just teachers noticing things.

 

I can’t speak from my own personal experience with surveillance. To be completely honest, I grew up in a more… Not wealthier, but a side of the city that has a lot more white students at my school [and] a very low “person of color” population. So at my school, I wasn’t seeing it as much because I don’t think that they saw the need for it. They’re like, “Oh, we don’t have that many ‘troublemakers’ to worry about,” but I heard about it from other people in the community who were like, “Oh, yeah, I was getting interviewed by our school police every once in a while.”

 

Regarding people’s online surveillance experiences, I think everybody’s kind of had at least one hacker scare. Like, “Oh, who’s in my account?” or if you look around now, pretty much everyone has a sticker over their webcam? It’s kind of weird—Everybody’s like… aware. Online people constantly make memes of, “Oh, here’s my FBI agent watching me on my camera,” and I’m like, “That’s not normal!” Yeah, it can be funny sometimes but having to constantly be aware and think that, “Oh, hey. Someone’s watching me right now,” isn’t normal. The thing that I hear the most is, “I don’t really care because I have nothing to hide,” and it’s not really a matter of if you have something to hide. It’s more like, “Are you really okay with constantly being watched even if you aren’t doing anything wrong?” Like, it’s just not right.

 

Edgar Cruz: Yeah, I mean, Kenia did that very thing of like, “People are finding this funny but this feels so wrong. How do we use that same humor to insert a ton of super useful information?” So, a lot of the e.Woke newsletter looked like memes. It looked like, “This is my FBI agent… but also did you know that it’s possible for it to be used in this way?” So, entering that sphere of super young, super funny people and then dropping off a ton of information along the way.

 

Each piece seemed to unravel in a new, different way and into a new, different path almost. When we were doing the APS research, we learned, “Oh, people are having more and closer interactions with their school police officers.” And then, you know, we were like, “Well, okay, wait—Who is hiring these police officers?” and they were like, “Oh, they’re just a random company who has security.” They have no training with young people. They were not even trained for the job to ever serve young people, but they’re working super close to young people. There was like a lot of, “Wait, how did that even happen?” and learning that that was a thing. You don’t need to be trained to work with young people to be in a school.

 

We found some interesting research … about metal detectors making schools actually less safe. That helped us dive into the perception of safety and how people understand their own safety and that if we indoctrinate young people into believing that very thing… We really thought about, “Why do people feel like they have nothing to hide? What is it that we’ve learned that makes us feel like our information online isn’t useful to somebody else?” So, some of that was like, “Well, how do young people perceive safety?” and then, “How do we normalize unsafe practices for young people?” and that came out of a study that showed that metal detectors and police or surveillance cameras actually made people less safe and that if you are in front of a camera, you will behave differently and not often in the best way.

 

From there, in the bucket of who is impacted the most, we learned that it was Black and Brown students and so we were like, “Okay, well, who is deciding that?” and then we stumbled upon a ton of research on people’s perception of danger and that most people in positions of power in school boards had biases against Black and Brown students. We learned that young Black girls are amongst the most punished students—they are punished and reprimanded, like, six times more than their counterparts—and young Black boys are punished severely and disproportionately compared to their peers. We were like, “That, coupled with this information—How is this not targeting just Black and Brown students?” Without regulation it will just keep creating this whole generation of young people who are used to being surveilled a ton and will keep on getting targeted as they grow up.

 

We learned also that by having surveillance cameras and trying to learn everything about young people, the slightest thing became an infraction. The slightest thing sent you to the police officer or to the principal’s office. Part of that spun off into researching what it meant for young people to have close proximity to the criminal justice system at a younger age and what that does when they’ve had some type of ticket or something at a young age and then get into some real trouble later, and how that feeds into the prison industrial complex and that pipeline.

 

Kenia Alonzo: I would say there’s also just kind of, like, desensitization around constantly being watched. I mentioned the meme about like, “LOL, my FBI agent watching me do my makeup,” or something and it’s like, I remember people were outraged when Edward Snowden revealed that the NSA is always watching us. People were like, “Oh, my God!” and now it’s like, the Chili’s app asks if it can track you—always—and you’re like, “Uh, yeah, sure.” I think it’s super important to remind people that it’s not normal to constantly have to be thinking, “Oh my God, is my social media gonna reveal everything about me? Who’s looking at me?” You shouldn’t have to think, “Oh my God, who’s watching me?” from the moment you wake up to the moment that you go to bed. Like, I’ll text my mom or something and I’ll think, “Is someone watching me through my camera right now? Because I’m at a really weird angle so it’s gonna look really bad.” It’s weird. The only way I can describe it is we’re worried about the wrong thing sometimes.

 

Lexi Spencer-Notabartolo: I’m really curious to know about some of the ways that you engaged in the process of all of the campaign work that you undertook during this period of time. I feel like in my interviews in this project, a lot of what people really struggle with is where do you start? Or how do you make this sustainable? Or how do you decide what you’re going to talk about at these trainings? You talked about centering the community knowledge and what people were encountering, but what did that actually look like? How were you talking to people? How were you getting that community input?

 

Edgar Cruz: One of the earlier trainings was about just how to protect your password … and from that one, we really were able to gauge and get a good sense of where people were. That’s when we started reshaping and reworking because we were like, “Oh, people really don’t think it matters,” and then they don’t care that much, and the people right now who do care think the sky is falling. They’re at the other end of the spectrum. So, we were like, “Okay, how do we get to the center of that broad spectrum but also how do we identify those people within vulnerable communities?” thinking about immigrant communities.

 

We started slowly gathering—with Kenia curating—articles that showed, “Oh, ICE used this information,” or pulling up contracts between ICE and Apple or Google and we would list every now and then all the contracts that we could find that were public between, like, Palantir and the government or Palantir and ICE. We would really try not to spoon-feed too much because we were like, “We don’t have that type of knowledge, we can’t successfully do that,” so if we can just get people to be curious about it, we’re gonna keep on providing more resources for whenever they want to come back to it. That’s what our efforts were online: We’ll put out pretty provocative articles about like, “Oh my God, ICE has been tracking people all along…” but we will hold you through it because we will provide so many tools and resources for you to come back to whenever you’re curious again.

 

In the trainings, we really just opened up conversation about like, “What have you thought about your password? What would you think if your partner took a hold of it and misused it?” That sometimes sparked a lot of really good conversation, because—unfortunately—a lot of people have that experience where their partner or somebody they don’t trust, or trusted at a time, got a hold of their information. Right around that time, I don’t remember which university on the East Coast it was, but a bunch of people hacked their professor payroll and they were funneling a bunch of money—like two cents from each professor—into accounts and ultimately it was a ton of money. They were some of the earlier people to implement two-factor authentication and we’d use that as a case study of like, “So imagine what this slight hack would look like for you.” So much of it was just really engaging in that conversation and thinking out loud with people in the trainings of like, “Have you ever thought of this going this way?” or, “Who would benefit from knowing where you shop every Christmas and where you eat every season? Or where you travel to?” or, like Kenia said, “What is the shape? What are the dimensions of your home?” and really forcing people to go up against that. We brushed up against a lot of recalcitrant people who were like, “No, that’s not a thing and you’re kind of crazy,” and we’re like, “Sure, that may not be exactly what’s going on, but we should understand that this work has laid it out so that that type of crazy surveillance is possible.”

 

We were really careful about setting up situations that weren’t super alarming but were really trying to tease out the possibilities of just how much people were putting out on the Internet about themselves and really telling people like, “You put out a lot of information without trying to,” and a lot of it was the mindfulness of, “Do you have to approve this? Do you have to post this or have your location on?” I remember us doing trainings internally about when you take pictures, understand that there’s metadata attached to photos, understand that you can be tracked without posting exactly where you’re at. Facebook has like hidden location services that you just have to turn off. So does Google Maps. That was a lot of the effort internally, to really make sure that we’re keeping our young people safe. Then we were able to be like, “Okay, so now how does the rest of the community understand that Google Maps is tracking them all the time?” or what apps they’re going to on their phone. Those are the kinds of conversations that would come up.
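For readers who want to see the photo metadata Edgar describes, here is a minimal sketch in Python (using the Pillow library; “photo.jpg” is a hypothetical example file, not anything from the interview) that prints any GPS coordinates embedded in a picture’s EXIF data:

```python
# Minimal sketch: read the GPS section of a photo's EXIF metadata.
# Requires the Pillow library; "photo.jpg" is a hypothetical example file.
from PIL import Image
from PIL.ExifTags import GPSTAGS

img = Image.open("photo.jpg")
exif = img.getexif()
gps = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo EXIF tag

if gps:
    # Translate numeric tag IDs (latitude, longitude, timestamp, ...)
    # into readable names to see exactly what the file reveals.
    for tag_id, value in gps.items():
        print(GPSTAGS.get(tag_id, tag_id), value)
else:
    print("No GPS metadata found in this photo.")
```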

 

Kenia Alonzo: I remember a little while after the Parkland high school shooting, I began working with a local organization named Fight for Our Lives and we were planning a very, very big event—Probably one of the biggest events that the city has ever had. It was a protest—a march—and it was kind of scary because it was all high schoolers. I was getting ready to graduate my senior year, but there were freshmen and sophomores at these meetings, and I was like, “These are children. We need to make sure that we’re being safe.” I saw an opportunity to do a little workshop for them and that was probably one of the best workshops that I’ve done. Like, it was pretty short and sweet but they were all very interested and invested in learning, “Hey, what’s going on?” One of the things that’s coming to mind that I showed them that was super important was using the app Signal for their messaging. We were discussing very private things. We were talking about our personal fears because of school shootings. We were talking about, “Here’s where we’re gonna meet. Here’s what’s gonna go down,” if something happens. We really couldn’t risk someone seeing that. So, I showed them how to use Signal and honestly, if you meet any activists here in Albuquerque everybody is like, “Oh, yeah. Just Signal me,” and I love that. I love Signal. I think it’s a great app. I showed them that and a lot of us have iPhones and something that they weren’t aware of is that iPhones pretty much track every single place you go and they store that information. I showed them how to turn that off and when I showed them they were like, “Oh my God, it knows I went to this restaurant at this time. It knows my address.” I was just very glad to have these resources taught to me by GJ and then to be able to pass them on to my peers. I just love saying that I was able to share these resources and share the information that I learned because of GJ with people who really needed it.

 

Lexi Spencer-Notabartolo: What would you say was the outcome that you felt like you were working towards? Was it specific? Or was it just general awareness?

 

Edgar Cruz: I think it was like, because we knew it was such a scary experience that we were like, “We know this has potential for somebody to misuse it against us,” like, the fear itself. So, I think the biggest purpose that we had noted as a goal was just to inform people, to engage them to think critically about this—the role of technology and surveillance in their own life—and to give them the tools if and when they were ready to take action. But so much of it was really just sticking with them through the tough conversations and kind of giving them some of the language to think critically about what digital security could do. It was a huge educational effort to inform people, just laying it all out for them that this information will be here when you’re ready, but this is what’s possible.

 

Kenia Alonzo: I feel that at GJ, we never really tried to host a workshop to purposely scare anyone or be like, “Hey, I need you to delete like your Facebook, your Instagram, disconnect your email from everything, you’re going off the grid,” type deal. It was always like, “Hey, we’re just letting you know that this is happening and there’s really no need to feel super terrified but it’s good to be kind of cautious.” Like, “Hey, if you’re planning a protest make sure you’re using Signal and not just texting from the Messenger app or something like that.” Something that I was always very proud of is that we never wanted to scare anyone. Our intention was always to make information and resources accessible.

 

Lexi Spencer-Notabartolo: I’m curious to know, like, because you started your engagement with this at a relatively young age, were there people in your life that inspired you in terms of engagement and activism? If it’s not specifically about this, then about some other kind?

 

Kenia Alonzo: I’m gonna be completely honest: No. [laughs] Okay, pre-GJ, no. I am a child of the Internet, and I grew up with unmonitored Internet access. So I was reading about police brutality when I was like, 12 or 13 and I was like, “Hey, this is wrong,” and I was learning about racism. I’m not saying that my family or my teachers or my friends are bad people who were teaching me bad things, but they also weren’t teaching me about anti-racism or about sexism, ageism, or classism or anything like that. I kind of took it into my own hands to be like, “I need to learn about this because I feel really bad when I see these things happen on the news so what can I do?” So, when I joined GJ I was like, “Oh, thank God. There’s people out there who really care about this, too, and they want to teach me and they want to teach other kids in the community about why this is important.”

 

When it came to digital security, I really did scare my parents and my sister and my friends with it. [laughs] Not intentionally, just because they were very unaware. Like, let’s say 2014 or 2015, my Mom was using Facebook but she wasn’t thinking like, “Hey, it’s taking all this information. It has my face everywhere on it.” I think joining at a young age and learning about all this was actually—It was a lot at the time but now that my brain is more developed, I’m very thankful that that got put into my mind at an earlier age. I’m very thankful and happy that I was able to learn about it because now I just try my best to be a little more careful when I’m doing something or when I’m trying to help other people with something. Every time I talk about it I’m like, “Oh my God, I was a baby when I was getting into this.” I was 17, 18 when I started writing e.Woke. Honestly, it’s quite a flex to be like, “Yeah, I was curating a national digital security newsletter and I was really funny and people loved it.” It’s something that I’m very proud of and something that I’m very thankful that I had the opportunity to do.

 

Let’s say after I joined GJ, the people I was looking up to the most were the mentors that I had because I was learning something, like, every day. Media literacy, especially digital security—Things that I would have never learned about if I wasn’t interested in activism.

 

Edgar Cruz: I mean, I wasn’t that young. [laughs] No, I was still super young. I think Kenia is totally on it that it provided a foundation of understanding. Generation Justice as a project is a media literacy building project and this just went hand in hand with that. That we were like, “We need to think critically about what has made us believe that this didn’t matter before,” but also how to catch on to certain things that feel like they’re not right and likely aren’t, and really break down who’s benefiting from this and who’s bearing the brunt of it. It provided a super strong foundation that I still feel every day now of thinking critically and also having this built-up digital literacy of like, “Somebody’s gonna really suffer from this, even if it’s not me and we need to figure out who it’s gonna be and how to protect them.”

 

Lexi Spencer-Notabartolo: How do you feel like your exposure to this and your experience with this has either shaped or altered your trajectory at the point that you’re at now in life?

 

Edgar Cruz: I think part of this foundation is just mind-blowing to me because, as Kenia mentioned with trainings around the protests, the way we were able to adapt as a team around a ton of the events that just happened to go on during that time, because 2016… 2016, 2017, 2018 was an uprising moment for the nation. We were like, “We need to be mindful of the Stingrays [cell phone surveillance devices] that are popping up around the university. Though they can’t be on campus, they’re close enough that they’re in range and still collecting a ton of data from everyone on campus but also anyone around.”

 

That as a concept builds the understanding for me that it’s not right if your excuse for, say, not being racist or not targeting people is that you just gather information from everyone—That’s not right either. Of course it’d be bad if they were only targeting Brown people, but their way is, “Oh, it’s okay then. We’ll just target everybody to be fair,” and I’m like, “Something about that doesn’t feel equitable.” That really planted that concept of there’s got to be a better way and we never really should have just those two choices.

 

…A lot of the work that I do now is communications work for a broader audience and just the misinformation and weaponization of misinformation that we’ve seen in the last—I mean, the last administration but in the last four years, really—has been just astounding. I could have never imagined having this insight that would benefit me now. I think GJ as an organization laid out the groundwork to make fighting disinformation not only possible, but it made them thought leaders on it. We all came out of that really understanding disinformation because part of educating people about this so that they weren’t afraid is to debunk a lot of… Like, there were sections every now and then in e.Woke debunking things like, “Oh, there’s this article that’s saying this is what’s going on, but that’s not true. It might be dissuading you from looking this way by telling you to look this way, but we should really think critically about that.” It made this whole era of disinformation so much more—We were just able to break it down differently because we had this foundational understanding that data can totally be misused and most often it is by somebody. The biggest piece that I gained is the way it prepared me to fight disinformation now.

 

Kenia Alonzo: I would say pretty much the same thing. I had a lot of training and I learned a lot about media literacy, so I feel very lucky to say that I feel like I know my sources when I’m trying to research something. I have a lot of family members who do like to share a lot of misinformation. In texts they’ll be like, “Hey, like, don’t get this vaccine. It’s gonna kill you and it’s gonna make your hair fall out.” I’m kind of like, “How many times do I have to keep explaining? Guys, like, please.” I feel very lucky to know how to properly find my sources and when I want to find something out I know, “Hey, maybe stay away from this news source—They’re really biased,” or, “If you want someone really neutral, go here.” I would say also something that I took from learning about digital security is—I was in high school and a lot of high schoolers like social media. I was always very careful to not post anything that I was like, “Oh, my school will never see it,” but, like, they’ll see it, you know? I think there’s been cases of schools finding social media accounts—which is a form of surveillance, so why are you spying on your students?—and getting students in trouble for something.

 

I’d say right now, I kind of did a 180 from being like, “Yeah, technology! I love typing on computers all day and being inside!” I did a full 180 because I’m going to school for natural resources and I’m taking a lot of environmental classes. I kind of transitioned from, “Oh, yeah, I’m gonna sit at the computer all day since I really need to know all this,” to, “I know a lot about digital security, but I’m around trees all day.” Personally, it doesn’t feel like it applies to me as much anymore but when I come home and I do sit in front of a computer, I know what safety precautions I should be taking. Even if I see a news article that I think applies to someone, I’ll share it with them and be like, “Here’s how to protect yourself because I know you’re on a computer all day, and I know you’re using Zoom. You’re using all these apps that came out. You’re using Tik Tok and it’s, like, why is the algorithm on that so good? Why did they show me something that I was thinking about earlier today?” Just little things. It’s not really like I’m scared of everything at this age, it’s more of like I’m questioning a lot of things because it’s so easy nowadays to get tracked and to get doxxed and everything, so I just want to make sure that people are being safe, like, that my little nieces and nephews are being safe. What’s great is that I’ve been given the resources and the knowledge and I can share that with others.

 

Lexi Spencer-Notabartolo: Is there anything that you wish people would ask you about this work and about the kind of organizing and activism you engaged in that you feel like you don’t get to talk about when you talk about this?

 

Edgar Cruz: What I learned from talking to people is to jump quickly to, “I know you don’t think this matters to you, I know this is not a thing you feel you have a role in.” In talking about it, I know that the end goal is to help usher people out of the thought that is oppressive and telling you, “Oh, you don’t need to think about this because it’s technical language and you probably—Like, there’s nothing you’re hiding, right? So it’s fine.” It’s informed me to already know that they might be thinking that way and to structure it in an educational way that doesn’t make them defensive to go there right away but empowers them to know what those resources are. Echoing what Kenia said, I don’t feel scared. I more so feel aware. I think this type of research and getting it in the hands of community meant having a quick turnaround between learning something scary and really soon after learning what you can do about it… and how much you actually can’t do, but really how much you can do and really what’s available. I think it usually takes longer to get there when talking about this.

 

Kenia Alonzo: I would love to be asked, “What else can I do?” Because I feel like there’s only so much you can do online. Yeah, you can change your password, you can make sure that you have, like, 18 different passwords, put a sticker on your webcam, but it’s like—Why should we be doing all of that when we can start doing work that would directly change that? What if instead of having to worry that our webcams are always covered, we make sure that these companies or the government can’t even be watching us in the first place? I want to be able to take digital security, take what we’re doing online, and apply it to real life. We’re doing the most that we can to make sure that we’re safe online, but what if we start not having to worry about being safe online because we know that there’s laws in place to make sure that we don’t have to worry about being watched through our webcams when we’re getting ready or anything like that? It’s great that we’re taking so many precautions and we’re teaching each other how to be safe, but I would love to see a time when we don’t have to worry as much. I would just love surveillance to not be as extreme or not have it be a super big concern anymore by getting rid of it.

 

Resources from Generation Justice:

https://generationjustice.org/page/1/?s=e.woke 

https://generationjustice.org/blog_post/e-woke-57-the-creature-from-the-dark-web/

 
