Marika Pfefferkorn: Oral Histories of Surveillance

They have contracts with folks that are monitoring social media content and we know that has actually been going on for some time now. Prior to that, it was not technology. It was just individuals. School resource officers that were surveilling our students and their families in these spaces.
—Marika Pfefferkorn


Our Data Bodies (ODB) is excited to present a series of oral histories from organizers, writers, and cultural workers whose work has been steeped in resisting discriminatory surveillance technology and racialized surveillance capitalism, and has illuminated strategies for abolition and abolitionist reform. This oral history features Marika Pfefferkorn, co-founder of the Coalition to Stop the Cradle-to-Prison Algorithm and longtime educator and activist in Minnesota. The transcript, which has been slightly edited for clarity, follows below.



Lexi Spencer-Notabartolo: How would you describe the work that you do?  


Marika Pfefferkorn: A lot of the work that I have been focused on over the last 20 years has been education justice, whether it’s Black male student achievement, college and career readiness, or discipline disparities—all to make sure that our young people are getting an education that they deserve. 


Lexi Spencer-Notabartolo: Can you tell me about your first experiences with surveillance in your community? 


Marika Pfefferkorn: Right now, during COVID, what we’re dealing with is our school districts partnering with folks like Gaggle, where they’re actually surveilling the students’ emails. They have contracts with folks that are monitoring social media content and we know that has actually been going on for some time now. Prior to that, it was not technology. It was just individuals—school resource officers—that were surveilling our students and their families in these spaces. That’s where I came to enter and see that surveillance was happening.  


Lexi Spencer-Notabartolo: When did you first become aware of surveillance, generally?  


Marika Pfefferkorn: So, the Twin Cities Innovation Alliance—we are a technology entrepreneurship body and so that has always been in our purview. We may not have called it surveillance but we’ve looked a lot at ‘smart cities’ and how they’re using that for surveillance, so my awareness has definitely grown over the last 5, 6 years. But my point of entry with surveillance, on a personal level, is in this education justice movement. Like surveillance on Facebook; we know that was happening the whole time and it continues to happen, I’m aware of that, but it became much more personal to me when I recognized the implications it had and that the systems that are supposed to be supporting young people are actually using tools of surveillance against our students.  


Lexi Spencer-Notabartolo: Was there a specific moment or incident that moved you to start organizing against this?  


Marika Pfefferkorn: Yes. In 2018, Ramsey County, the city of St. Paul, and St. Paul Public Schools entered into a joint powers agreement (JPA) that would allow them to create a new governing body that would collect and hold data on everybody involved in the systems. And so when you think about Ramsey County, you have child protection services, you have the county prosecutor’s office, you have the sheriff, so you have all these different criminalizing bodies that exist within this system. St. Paul, you have the police department, housing information, and then St. Paul Public Schools, in which you collect all kinds of different data. And so now all of these entities—for what they said was efficiency and effectiveness in delivery of services and programs—were going to collaborate and share all this data.


I knew something was up. I didn’t know what, because this was not my purview. So I just really wanted to find out more about what was going on, and so what really drew me into the conversation is, in an effort to be more efficient and more effective in delivering services, what they decided they wanted to do was focus on upstream and so they were going to use an early warning system through predictive analytics to identify a risk factor for a student. And when I learned about the indicators that they were going to use, I was like, “Are they crazy? They’re using perception data?” And, in particular in my field of discipline, they were going to use suspensions, and specifically in that year in Minnesota, the Department of Human Rights identified 43 school districts for discriminating in out-of-school suspensions. So if you’re going to be using that data to identify who’s more likely to be involved with the criminal justice system, I was like, “Well, there’s no way that we can win. So yes, you’re going to be collecting all this data, but what are you actually going to do with it? Like, what is the problem that you’re trying to solve here?”


And we recognized that they had come across all these indicators that we really felt were deficit-based, but they didn’t actually have a solution. So they were going to surveil our students by collecting all this data, but when we asked them, “What are you going to do with it once you’ve identified these students as being at-risk?” their answer was very similar to what they’re doing right now, which is, “We’ll have the students that are flagged and we’ll send a social worker out to the house and say, ‘Here, we’ve got a list of resources that will help prevent your child from being in the juvenile justice system.’” I neglected to mention that in addition to using suspensions as an indicator of who was at risk, they were also going to be using attendance. They were also going to be using “Were your siblings or your parents ever involved with the juvenile justice system?”


So all of these factors were at play. We really felt like this is a deficit-based approach, where the system doesn’t take any responsibility for failing our students and families, and they did not really think through the process to what is the solution that will actually shift the paradigm of the school-to-prison pipeline. So that really was the eye opener, not just for me, but for parents and community members that had participated in a community engagement process that they thought was going to lead to better upstream solutions. In those conversations, community members came up with recommendations on what really should be invested in and supported, and they took one of those suggestions—which was “share data,” but it meant within a system—and translated that and said, “The community wants cross-system data sharing.” So in addition to suggesting this deficit-based approach of surveilling students, they misrepresented the community and what it actually wanted as an outcome to improve outcomes, particularly for Black and brown students! And so not only was it a flawed process in their selection, but they really, really did not understand the lack of trust that existed between the community and the system and how this further undermined our ability to work towards a solution.


So that was my serious point of entry: it directly impacted what I had been trying to work on for the last 10 years, where we’re going to collect data about suspensions so we can hold people accountable, and now you’re going to use that data that we’ve been trying to collect against us. What I think it reflects is patterns and bias in teachers, not actual behaviors in students. The lens through which they were looking at this information was just so biased and deficit-based, so we organized as a community in response.  


Lexi Spencer-Notabartolo: Was the buy-in from the community instant?  


Marika Pfefferkorn: There’s a lot going on here already, so when we raised the flag like, “This is something that’s happening,” folks were like, “Yeah, yeah, and so is this and this and this.” And so we did have a big learning curve, and we had to actually identify each individual’s point of entry where this mattered most to them. And we found that it was in a really small, minute way. As they were talking about preparing to use this data collection process, there was a community meeting and parents said, “If we decide to opt out of this, will we still be able to get our iPad?” and the school district did not have an answer to that. And the fact that that small piece—that iPad—could impact the experience of their student. That’s where we found the point of entry for parents, and they were like, “Oh, no. This is not okay.” And so from that entry point, we started educating them more about the other ways that this really felt like an inadequate response to what we were asking for. Once we pinpointed that point of entry, we started holding summits, house parties, convenings, community conversations to really unpack what was happening.


And so what we had to do was really go about and find the point of entry for all the folks we were trying to engage in the conversation. For example, our tribal communities had not been engaged in this conversation at all. They had been invited to the community engagement conversation and what they said in that conversation is, “We’re concerned that our information that we share here is going to be misrepresented.” So they already had issues of trust with them and when they found out about St. Paul Public Schools unanimously approving this, they said, “Well, why didn’t they check in with Indian education because it’s state law that we have tribal sovereignty in the pursuit of education here? So what does it mean for us, as a sovereign people, to have them have the ability to collect our data?” And so that was one point of entry where we found there was a great deal of pushback and a lot of our native folks here started organizing in their own respective communities.  


Another point of entry was our young people. We had a community forum, and we were talking to the young people and they’re like, “Yeah, yeah. There’s a lot of stuff going on.” I wasn’t quite sure if we were getting the message across but, actually, following that meeting I got a phone call from one of the students and she was like, “Oh my God, I was sitting in class and I started looking around and thinking about this point system, this risk system that would identify me or some of my other classmates as at-risk.” And she said she started looking around and said, “Oh, I’d give them a five and I’d give them a three and they’re always up in the front…” And she said, “If I had been asked to think about this last year, I was in a very different situation. We were experiencing housing instability.” Her mom had lost her job and so they were staying on couches of aunties and friends and family for a while and academically, things were tough, obviously. But she said, “If they took that moment,”—she didn’t use that language—“if they took that snapshot of who I am and who I am today, I would be at high risk for being involved in the juvenile justice system.” And she said to me in these words, “My past does not predict my future or my potential or the opportunities that are in front of me and that is why I’m upset about them proposing to make these decisions about me.” And so that was another point of entry, just looking around and she’s like, “We’re already labeled. And I’m labeled, but I’m in a new space and they don’t know about how I was struggling last year and I like that they can see me as this whole person now and they treat me like I have a value and that I’m there to get my education.”  


So the list kept growing where we kept finding these points of entry whether it was through housing, whether it was through domestic abuse, all these folks were like, “They have access to our data? And they’re going to be making decisions on it? And it’s not even people that we know, it’s going to be machines? It’s going to be an algorithm?” “What is an algorithm again?” And so we just kept having these conversations. Part of it was really about interrupting the intimidation many felt about using often overly complex language that doesn’t need to be so. So when we talk about algorithms, we created an activity called algorithmic improv—a way for them to digest this information, play it out, become confident, build their muscle about using language, or being comfortable even mispronouncing “algorithm” in public and still being okay with it. Because it’s not about the pronunciation, it’s about the understanding and the impact.


So just continuing to have those kinds of conversations with youth, with elders, with community members, and with staff internally in the district. What we also did is a stakeholder analysis, asking who is involved in this process and who’s influencing it. And we found out that although the decision to approve this JPA was unanimous, most of the folks that signed off on it did not understand what they were signing off on. They did not know what predictive analytics was. So we recognized that there was a learning curve, not just for us, but for those that had signed on, and so although we were not immediately able to get them to dissolve the joint powers agreement, we were able to get them to take a pause so we could all do a deeper dive and ask and answer questions that we needed to understand before we could even move forward—because what they proposed was, “Oh, we’ll just change it a little bit.”


And to us, this process was so flawed, from the community engagement to the fact that for the last five years you’ve been in conversations with these companies and you didn’t bring anyone else in and you thought the community engagement was the only spot we needed to be a part of the conversation. So it really was a reckoning of relationships with community and institutions because originally the institutions are like, “Oh, you always spoil everything. You don’t want to advance, you don’t want innovation.” And we’re like, what is your definition of innovation? Because this is not it. This is you all politically trying to have a silver bullet without a real solution. And what happens to the resources that are already scarce that we now have to invest in these technologies when we still don’t have enough people serving our young people in the schools as it is? And so we kept lifting up these challenges and concerns and continued to have the conversation.  


And one of the things I will say that was really productive to this—because, again, I keep bringing up this legacy of trauma and mistrust between systems and communities—is that we also used a restorative practice approach to bring community together, because we knew that our communities of color are really carrying a burden just even having these conversations. So there was also an outlet to let go of some of this harm of always being hyper-surveilled and being used as guinea pigs with new technology.


So we used a restorative practice to convene people to talk about what was happening, and we used a restorative process to meet with decision makers and policy makers within the systems. Even after we went through this whole two years of organizing and after the agreement was finally dissolved, we still met with the decision-makers to do a restorative repair of harm, because we still needed to have a relationship with them going forward. We knew part of the issue is that these companies were coming in and saying, “Hey, for the first three years we’ll give this to you free if you let us use this as research and to be able to promote this to other school districts and other counties.” And so it was also a question of the budget. One of the questions—even in 2020 I still can’t get an answer from anyone—is what was the actual budget for this beyond what was provided in the agreement, like, the pilot? So they would design the algorithm, they would do the predictive analytics, but at a certain point, how much was it going to cost to protect and house all this data?


And so we had all these other kinds of questions like beyond these first three years, what is the cost? Where does that money come from? Is it coming from social workers? Is it coming from counselors? Where’s it coming from? And they really hadn’t had a thoughtful process to be able to give us that answer. And so we built the case, like, there’s not a budget for it, this is more of a deficit-based approach to children. This is more of the system not taking responsibility for their ills and their legacy of racism and bias in the institutions. So that was kind of our learning curve—and we’re still on it. In January, we’re launching the No Data About Us Without Us fellowship, and so we’re actually partnering with Promise Neighborhoods throughout the state of Minnesota that have parents and we’re doing an eight-month training so that then they can go and ask questions and be the ambassadors in their communities because we know this keeps surfacing. We know this keeps coming up and we don’t have the capacity to be everywhere all the time. So it is really our goal now to tell our story, get the training, get the language, get the information out there, and really build a network of folks that are aware that this is happening and have the ability and the wherewithal to push back.  


Lexi Spencer-Notabartolo: That’s incredible. I also love the idea of algorithmic improv.  


Marika Pfefferkorn: That’s always the most popular!   


Lexi Spencer-Notabartolo: It’s because it’s such a clear and engaging way to manifest things that live on a computer in real life. 


Marika Pfefferkorn: Yeah, so we did One Mic, where they get to learn about it, and then share with other parents. We did the human algorithm—if we were to build an algorithm, what would success look like? What are the indicators that we would use? And so just different ways to think about it and unpack this really alien approach to education for many of our parents that were still just trying to get their kids to school every day. What was most exciting was when we found out about the other stakeholders. After we had a summit and after we practiced, the parents came up with a list of questions that they wanted to interview legislators or the city council or the school board on, and they all came back and they said, “Wow, we feel so much more informed than they do.” So it was really an opportunity for them to build their confidence, and now when they talk about it to other parents, they’re like, “Don’t assume that they know what they’re talking about.” And that really mitigates that layer of anxiety for many of our parents.


So we’re going to have them do that again as a part of the fellowship this year. They’ll be interviewing state legislators, like, what are the state policies? That’s one of the things that we surfaced… that they didn’t even follow state law—which is a very vague law—as they were building out this JPA, because in Minnesota, after 10 years you have to come up with a plan of what you’re going to do with the data that you’re going to collect. Well, they had no prohibition whatsoever. It was just going to be this expansive data collection and they had no plan in place. That’s a minimal law, but they didn’t even do that. So it’s also our goal, as we’re moving forward, to think about what are the policies that need to be in place at the district level, at the city level, and at the state level because as a result of what happened, all the county and the state agencies call me personally and say, “Well, we’re thinking about using this da-da-da as,” and I was like, “Well, I can’t speak for that.”


But they’re all scared because they had invested millions of dollars to get this moving and they don’t want to get that far out, so we’ve been having conversations with them about what that process looks like now with communities. We’ve been looking at community data advisories, we’ve been looking at what does a community impact assessment and community benefits agreement look like with communities and institutions, even though we know a community benefits agreement is not legally binding. What can we get out of that in the commitment from the institutions that might make a difference in the long run? So we’re still trying to figure it out.  


Another piece that we did is, like, we know Minneapolis or St. Paul cannot be unique in this, and so we partnered with the Dignity in Schools campaign. They have about 115 member organizations across the country that are working on the school-to-prison pipeline and we partnered with them and had them do some research in their local communities to figure out if anything like this was showing up in their school district and we actually worked with 25 cities. And then we worked with some interns at the Ida B. Wells Just Data Lab at Princeton and they did some mapping and grading and scoring of how school districts across the country are either talking about their agreements, talking about the protections of, like, FERPA or what kind of data sharing things are in place and so we’ve actually been doing some heat mapping across the country to really see where this is emerging.  


Lexi Spencer-Notabartolo: Were you encountering any support for this? Like the resistance that you encountered, would you describe it as people who were supportive of this data collection or just a lot of ignorance about what it entailed and the implications of what it would bring?  


Marika Pfefferkorn: I think folks were defending something they did not understand, and there’s a political aspect to this. All of the people that were in those positions were Democrats and they all had relationships, so there was a level of trust across these systems to engage in this opportunity that was supposed to be efficient and effective. And so I don’t think they did their due diligence or their critical thinking; I think they got rose-colored glasses when they were approached by the National Council on Crime and Delinquency to do this and I think critical thinking just went out the window. And our community started the Coalition to Stop the Cradle-to-Prison Algorithm. We named it the Cradle-to-Prison Algorithm since they were going to use child protection services and all these other pieces in there, and, you know, nobody has a chance now.


Not everybody wanted to join our coalition, because there are tensions between community groups that are organizing, but what we did find is a way to have conversations with those people and then use this body to really drive the policy. Nobody could even have a concrete conversation about it because nobody understood, so we published a policy brief. For example, the teachers union was like, “Well, we’d be okay with this if da da da da…” And we were like, “Well, that’s not where we’re at right now. We’re all in for getting rid of this and if they want to have this, they gotta start over completely.” I don’t think I mentioned this, but there was actually a data breach in Ramsey County at the time. I think that data breach had as much of an impact as our organizing did. So as we’re organizing, members of our community are getting emails saying that their data was breached because they received Medicaid when they were in high school, or any of these things. And so, it was a shocker, like, “Oh, wow. They can go backwards, too, with all the information they had.” They mostly were just thinking about the prosecutor’s office and the police department, but they’re like, “Wow, they have so much of our information,” and then a year later, that same data breach was found to have been even more expansive than they first understood. So that was a leverage point for us to say, “Your systems don’t even have the protections in place to care for what we have right now. How do you even think that we could open and expand the sharing of this information?” 


Lexi Spencer-Notabartolo: What would be the typical day-in-the-life of surveillance at home or in the street—either at work or school—for someone in your community? 


Marika Pfefferkorn: Well, we have a lot of cameras out and about across our community and the number of cameras is growing. I think our Black and brown communities are hyper-surveilled by the police. We think about North Minneapolis—they have the cameras pitched where they know the violence is happening, capturing it, and somehow it’s not changing anything about the violence, but they’ve got the cameras posted. I think about our transit systems that we have—the light rail and the buses—they’re hyper-surveilled. I just think anywhere a Black or brown body is moving, we’re being surveilled, whether it’s CCTV at shops, out in the streets, on the buses, how we move around the city—I think it’s everywhere, and I don’t even understand all of that because, for me, this hasn’t been like a ‘surveillance conversation,’ although I know it falls into that box. I’m still new to surveillance and what that means, and so for me surveillance continues to be the social media monitoring and the email monitoring. It’s interesting, very interesting. So I’m still fleshing that out and what it means for our community here, because surveillance is a scary word for many folks in our community, and to position it as surveillance, I think there’s more resistance.  


Lexi Spencer-Notabartolo: Is there a way that you think people would feel more comfortable describing it? 


Marika Pfefferkorn: I think we have yet to land on that. I think we could still have a conversation about Facebook as a tool of surveillance and folks would still disagree and still agree to be on Facebook. I think folks think if it doesn’t appear to do any immediate harm to them, it’s not really surveillance. 


Lexi Spencer-Notabartolo: Do you think that part of the complexity of what you’re dealing with in this case is that you have a school district involved? Like, state and local leaders and levels of government are involved—do you think that made it easier for people to understand the inherent problem because the government or elected officials were doing this as opposed to a company?  


Marika Pfefferkorn: I think there’s so much distrust between our institutions here that a company is less known than our institutions, so I think it just depends on the relationship with people. 


Lexi Spencer-Notabartolo: Who or what would you describe your efforts as organizing against?  


Marika Pfefferkorn: Well, first, organizing is: you all need to better understand what it is we’re talking about; what is the problem that we’re actually solving for? What are the resources that we have available to us? What are we organizing for? I would say that we are organizing for—if something like this is put in place, not necessarily the predictive analytics—a co-creation, an equitable ownership stake in the process, from co-creation to evaluation and assessment; we’ve got to be there the whole time, and if at any point in this process we are uncomfortable with it and we don’t want to go forward, we have the right to say no and you will respect that. Because the pushback that we kept getting is, “We’re so far into this, we can’t turn around now. We can’t turn around now.” And that’s unacceptable, because you started without us and that’s not our problem—no data about us without us.


And then also organizing for, as we recognize that this is becoming more prevalent—what is the cost? If you proceed with this and something goes wrong, what is the payback to the community that’s impacted, whether it’s individually or collectively? Because you’re draining value out of our data and we’re not actually benefiting from it, so naming what our value is for data is really critical, and folks understanding what the value of their data is has been a really deep conversation that we’ve been in. And organizing for an equity stake, that is, decision-making and ownership. Like, I cannot tell you—when I had conversations with elected officials, they were like, “It’s not our job to educate the public about these terms and these ideas,” and I asked, “Whose is it?” And they didn’t have an answer for it. So my answer was, “It’s us and you will need to be involved in getting that education because you can’t speak to it either.” 


Lexi Spencer-Notabartolo: How often, when you went into those meetings—especially with elected officials—did you feel like you were educating them?  


Marika Pfefferkorn: (laughs) Well, first of all, it took us forever. We sent letters, we asked for meetings, but we were seen as problem-makers and so they tried to work with more traditional groups like the African American Leadership Forum, who were not completely opposed to it. So they found other bodies to work with, but we continued to expand our base and they were not able to just shut us out of the conversation. I still think if you asked any of those elected officials, they would say that the process was wrong, not the product. So they still don’t understand, like, what it was about algorithms and what it was about predictive analytics. They knew that the community was pissed but it was still really hard for them to hear why. And we asked them things like, “Well, if we use a predictive analytics approach to your effectiveness as a legislator or as an elected official, or if we use predictive analytics to say how effective our teachers were or who might be more likely to be biased against our students in the classroom—how well would that go over?” And so we just had to try to keep flipping the script. I don’t think we actually won them over, because one of the things they’re doing as a workaround is they’re now trying to pass state law so they don’t have to do a JPA at the county, city, and district level. So have they actually learned their lesson? They learned that maybe as individuals this is not good for us, but maybe we can convince all these other people and move it at the state level, where it would be more difficult to push back on.  


Lexi Spencer-Notabartolo: Would any school district within the state that falls under state purview just be able to implement this, full stop? 


Marika Pfefferkorn: Not the same agreement with the JPA, because they did admit there were flaws, that there was no prohibition, no end date, nothing. But I’m trying to find… So they did one for Hennepin County; they have, like, 36 school districts, including charters, in there that they would be sharing across. They’re looking for ways to combine and have sets and networks where it’d be multiple school districts working with a county. So they continue to try to create it in different ways. I think they are a little more hesitant around the predictive analytics. I see them talking about it in different ways. Like, they still want to do the data sharing and more so than predictive analytics, it’s always coming up as “early warning systems,” and I think that’s also how they got around it with the public officials—when they said it’s early warning systems versus predictive analytics.


In most of those agreements, they always leave the door open for other people to join. Like when they created the new governing body, they were going to put the chief of police, Hennepin County prosecutor’s office, one of the associate superintendents, and somebody else, and those were the folks that were going to govern over this joint powers agreement and the data. So like, why are you giving the police chief access to this? Why is the sheriff’s office going to have complete access to information they’ve never had before? So I think they’re just trying to create that at the state level now. I don’t know where it lands with the algorithms and the predictive analytics, but they are definitely trying to expand the data sharing more broadly. 


Lexi Spencer-Notabartolo: As part of your organizing efforts, were you ever able to connect with more established groups and interest-sharing bodies, like the one that you said the local legislators were more interested in meeting with, rather than you? Were you ever able to coalition-build with those other groups?  


Marika Pfefferkorn: So, we have interesting dynamics. We might have been able to connect with individuals within those groups and they worked with us behind the scenes and they signed and sent letters, but some of the leaders of those other groups had different types of relationships. For example, the African American Leadership Council had a consent decree with the police department and so they had different relationships with them and so their approach was very different. The legacy organizations were not as supportive as we would have hoped but we were able to work with individuals. We tried to work with the ACLU but they also had capacity issues. When we were looking for a legal analysis of the joint powers agreement when we were very new to it and didn’t know what all it entailed, we called on our local professors at the law schools and other places. But our coalition at the beginning was made up of about 23 individuals and organizations.


One of the most interesting pieces is Jaylani Hussein, who was the executive director of CAIR, and his point of entry in this conversation was after 9/11 and Countering Violent Extremism happened and he said, “You know, they were doing these predictive models with Muslim students and particularly Somali students in the schools here. This has been in practice for a long time and now y’all start to care because it’s a broader audience that it’s impacting.” And so we had really critical people at the table that had a sense about what was happening or pieces of it. We had the St. Paul Promise Neighborhood—they actually adopted data policies into their issue campaign. We had Inequality, which is a large Native supporting group. We had my organization, the Twin Cities Innovation Alliance. So I would say that the legacy organizations were not as involved in the organizing, but grassroots, small agent organizations were very much front and center. 


Lexi Spencer-Notabartolo: Have those bonds continued?  


Marika Pfefferkorn: Yes, we continue to operate as a coalition. We meet quarterly now and we’re grappling with questions like: what does a community data advisory look like? Is it legitimizing institutions if we give them that? What are the state laws now that we need to think about advocating for because there aren’t groups here that are doing that? So we’re looking at the bigger picture and keeping our eye on the ball. At the same time, we’re in the middle of COVID, so a lot of that has also pivoted to the juvenile justice system and how they’re supporting or collecting information about those students. I mean, it’s still focused on the JPA and that work, but it also has taken on the COVID surveillance landscape as well. 


Lexi Spencer-Notabartolo: Given COVID and the implications COVID has—especially for parents or people who don’t necessarily have flexible working schedules—have you been able to keep your parents engaged? 


Marika Pfefferkorn: Oh, yeah. One of the nice things in our organizing model is that we value the contributions of parents, so we have received funding to give stipends to parents to continue to participate in these meetings, and our parents that are doing the No Data About Us Without Us, they also get paid because it’s their time, too. So I think when you show folks that you value their time and contribution—and also it continues to be an issue, not just in the St. Paul schools. So I think our coalition is expanding as COVID has had an impact, and in the wake of the murder of George Floyd and the subsequent uprisings, folks have continued to look for spaces of folks that are poised and actually organizing around these very things. 


Lexi Spencer-Notabartolo: Could you describe what your ambition is for this work?  


Marika Pfefferkorn: I believe that Minnesota, for me, is ground zero and that as we continue to learn and develop tools, our learning can be shared across the country, across the world, and that we learn from other people. So we intend to expand the No Data About Us Without Us fellowship to not just be Minnesota but to partner with other entities. We are looking at Enview by Civic Eagle, a tool that we’re working with where you can track the growth and evolution of policies at the state level and the federal level, and so also tracking to see this emerging growth. One of the other things that we’ve been doing a deeper dive on is those third-party vendors and those partnerships where they offer the first pilot years for free, and highlighting that. It really is about having an infrastructure in place that can actually challenge this because right now, I don’t think there is one go-to, and maybe I’m ignorant, but I don’t think there’s a grassroots go-to place where you can find this information, engage where you’re at, and move from there. So I think really about scaling what we’ve been trying to do here and—I don’t want to call it a think tank—really bringing together these parents and these youth and those that were going to be the impacted individuals, and really flipping the script about the role that they play in informed decision making around data.  


Lexi Spencer-Notabartolo: You described Minnesota as ground zero. Why do you think it is ground zero? 


Marika Pfefferkorn: Well, when we were looking for help and looking to see where else this was happening, we called out to a lot of folks and what we heard is that folks have been trying to communicate this for a significant amount of time, but it wasn’t translating well. So I think our story really was an impetus for others to really recognize that there is a space here to do this work. Ground zero… Minnesota thinks of itself as a very progressive place and for something like this, this emerging tech, or the cradle-to-prison algorithm to emerge out of Minnesota was shocking for some people. And they like to talk about a very strong education system—a strong education system for some, because our BIPOC students are not surviving or flourishing in these systems and so I think if you look at the convergence of class, race, education, all these pieces, for us, especially after the murder of George Floyd, it just kind of escalated all these things that have been here and that’s why I think—because we also were able to organize successfully to dissolve this JPA—that’s why I think we’re ground zero. Like, when we were looking for stories of success or models of what to do, we had to kind of figure it out for ourselves as a community.


At the beginning, there was a lot of doubt and naysayers saying we didn’t know what we were talking about, and one of the things that we did was reach out nationally. So through my Dignity in Schools network, Zakiya Sankara-Jabar, who was our field organizer, she connected me with Yeshimabeit Milner from Data for Black Lives, and Yeshi came out to St. Paul and she talked with our parents and she’s like, “You’re not crazy. The fact that you’re addressing this, you’re, like, 20 years ahead of other folks.” And so I think there was this invitation or an openness in our community to say, okay, even though our status quo people are pushing back against this, we are not alone and there are folks that have been talking about this at a technical level with data scientists and academics, but what we haven’t seen is how that translates into community. And I think this is also a really good example of that kind of ecosystem support—we definitely did a grassroots effort here, but had we not been told that we were not crazy and given some insights about the playbook, I don’t know where we’d necessarily be. So I also appreciated that outside perspective, and I think that was necessary, too, for our community because, you know, they told us we were crazy [laughs], and that we didn’t know what we were talking about.  


Lexi Spencer-Notabartolo: If you had to pick a song that describes the vision that your communities are working towards, what would it be? 


Marika Pfefferkorn: Ciara has a song called “Rooted,” and it talks about coming from strong roots, understanding and knowing who you are and that there is a whole ancestry behind you that has your back. “Young girls, get rooted” is like the hook for the song but really, with the restorative practice, we look back to our past—I look back to Ida B. Wells and I look back to all these other folks that were organizing that gave me the strength in the midst of wanting validation and being told you’re crazy to pursue something. So that song for me has been an anthem of like, it’s not just the people in front of us that are working with us right now, it’s our ancestors pushing us forward in this journey. The song reflects the amplification of that cycle of justice in our community. 


There was one piece that I didn’t get to share, and that is the funding piece around all this. One of the challenges was that it was a foundation that supported that original community engagement, which was mistranslated and used as a lever to convince the elected officials to approve the agreement unanimously. And so one of the challenges that we have found with foundations is there is a lag in their understanding about the role of data. We wanted to provide childcare, we wanted to be able to provide food and all these things for these parents as we’re organizing them, and the thing that we kept coming up against is the program officers at the foundations were like, “Isn’t data great, though? We’ve been told for the last 10 years that data is great.” And so one of the challenges that we found is people’s inability to pivot or challenge status quo systems, so a part of our organizing that I have to credit to our community is we pretty much did it with no resources. Like, they were willing to fund the community engagement but they were not willing to fund the opposition. And even still, we’ve been asking them for resources and support to continue to educate and engage parents and community members and they’re like, “Why would we need to do that?” And so this learning curve is not just for our elected officials, it’s for our funders. Integrated data is the fancy thing right now and cross-system data sharing is a thing, and so as a community, what we’re still trying to figure out is how to catch the philanthropy community up and hold them accountable for the ways that they are doing this work, or perpetuating the harm that is happening to communities. That is a conversation that needs to be elevated and I don’t know how to do it [laughs]. 


Lexi Spencer-Notabartolo: On the opposite side of the foundation/grant funding spectrum, we’re seeing more and more that any position where you would be learning the tools of grant writing is becoming increasingly data-focused. So people who might otherwise be really well suited to that kind of work and be able to access those kinds of funds for whatever causes that they’re engaged with—they’re intimidated out of those roles or building those skill sets because it involves math or like, there’s too many analytics questions. I had a phone call with someone not too long ago who’s really interested in it, this really powerful young community organizer, but she’s just terrified of data. She just didn’t get that as part of her education.  


Marika Pfefferkorn: Yeah, and data literacy is fundamental to our education system right now and no one’s having that conversation either. One other side note is that, in that process, we also recognized the role and point of entry of the data scientist, so some of the work that we actually have been doing is working with our local data science and machine learning institutions. Like at Macalester College, we co-teach a class now about community connectedness and data science because a lot of our data scientists are coming out and they don’t even know how to bridge that conversation with community to say, are we even on track? Like, what does this mean for us? And again, that requires equipping the community to have that conversation and it requires data scientists to be open and able to have that conversation.


So we’ve been working with Macalester College, Harvard College, Princeton, Michigan Tech, with all of their data science departments, because those are all folks that heard about the story and what we were trying to do and they’re like, “Okay, how do we navigate this with our students that we’re producing now that still don’t know how to talk in depth with somebody that’s different from them in a community that they’re unfamiliar with?” So, for us, the central point of our organizing strategy has been, “What is your point of entry?” and “What is your accountability and responsibility to solving this larger issue?” As for fund development, we still haven’t found their point of entry.  


Lexi Spencer-Notabartolo: Regarding the foundation side of things, you mentioned that part of what the involvement of the foundation did was it impacted the dynamic with the elected officials who were voting on this. Can you explain that a little bit more?  


Marika Pfefferkorn: So, the Bush Foundation was the foundation, and so there was confusion with the elected officials; I told you about the question I kept asking, “What is the budget? Where’s the budget?” And the chair of the county board’s answer was, “Well, the Bush Foundation is going to fund this.” No, they funded the community engagement at $200,000 and now they’re done. But nobody could answer those questions. Was there still a conversation going on with the Bush Foundation? Maybe so, but it was not public. We actually called the Bush Foundation to account and said, “We need resources to address the harm that you’ve actually done in the community.” And they were like, “Well, that’s not innovation.” And yet the community engagement was an innovation. They didn’t take any ownership for it.  


Lexi Spencer-Notabartolo: Have you had continuing conversations with the Bush Foundation specifically, or any kind of potential funders?  


Marika Pfefferkorn: Yes. So one thing that did come out of it is we actually connected with the Data Funders Collaborative and I shared our story with them and they said it changed the conversation they were having as funders. So, in that Data Funders Collaborative, you have Ford Foundation, you have Chan Zuckerberg, you have Annie E. Casey—a lot of these large education tech foundations that are all having conversations, and they’re looking at what their ethical practices are as they think about funding going forward. So they haven’t really done a great deal of funding. They’re in their learning phase. One foundation that showed up right away was the Hazen Foundation, and the Communities for Just Schools Fund. Those two foundations support education justice, and education justice is a disappearing field of foundation funding, and so what they could give was very little compared to what we could have gotten from those other groups.


So there was some influence with the data funders. The challenge with them is they see this momentum—like, predictive analytics is happening one way or another, now we just need to put the guardrails on it, talk about it through an equity lens, and we’ll move forward—versus we need to take a step back, re-engage with community, and think about what this looks like going forward. So we haven’t gotten to that strategy, but we have shifted the conversation with many of those folks, and then the funders—the Communities for Just Schools and the Hazens—are having conversations with them as well, and I think they have a little more influence and leverage than we do. So it’s a very slow conversation. We had the Data for Public Good conference on November 5, 6, and 7th and had a special session for funders, and what was striking is we had funders from across the country, but no funders from Minnesota because they just don’t get it, and that is who we were trying to have them hear from—funders that would influence their imagining—and they’re just like, “Meh, we don’t really care.” 


Lexi Spencer-Notabartolo: Did you find any unexpected support?  


Marika Pfefferkorn: I would say the institutions of higher learning like Harvard, Michigan Tech. We actually raised money by having me go out and speak to them and that resource would go back to the community to fund the childcare. And so they recognized this as a valid challenge and they were the ones that were like, “We’ve been having these conversations, we’ve been ringing the bell! But people don’t seem to care or understand.” And I was like, “No, they don’t, because with the language you’re using, I’m not even interested in having that conversation!”


Not to put academics down, but if you’re trying to translate that to folks that are directly impacted that don’t have any, like, relationship with technology, no. So the conversations with those institutions really look like, “What does it look like to have community connectedness with our institution, with our leaders, with our academics?” And so I think, again, they were another group of folks that told us we weren’t crazy.


And I think three people have written books now that have the JPA story told in it. Michigan Tech just released a book. Dr. Ruha Benjamin—she uses the JPA story in every talk she does now. We shared a case study with Harvard for their training of research and development staff for school districts across the country, because in St. Paul Public Schools, they didn’t listen to the Director of Research and Evaluation. She said, “This is not a good idea,” so she was shut out of the conversation. People that did not agree were shut out of the conversation internally. And then there’s the other piece of the folks that knew what was wrong, but like, “I got kids in college, and I can’t lose my job right now. So I know it’s bad, but we’re just going to try to find the best path we can.” 


With the academics, I also started getting invitations from, like, the Center for Democracy and Technology and the Brookings Institution, and a part of that is—because I’m a pretty local person—I don’t actually know the stance or positions of these organizations and what their relationship is with technology. So, in addition to the language, part of my learning journey is: who are the players? Who are the influencers beyond my purview? And I’m still figuring out, “Oh, if they’re inviting me, does that mean they believe in what I say? Or… what’s actually behind this invitation?” So I’m still navigating and figuring out, like, if they’re sharing my story, does that actually improve the situation? Or are they using it as a tool to say, “If you want to circumvent this, then don’t get the community involved, don’t do this.” 


Lexi Spencer-Notabartolo: Have you had any oppositional experiences yet, or has it generally been positive in those types of environments? 


Marika Pfefferkorn: Well, the thing is being pigeonholed as community only, like, “Oh, she can bring perspective on community engagement.” Well, it’s not just community engagement, it’s the whole process, because community is a part of the entire process if we actually are talking about equity—and by my definition that means ownership and decision-making. So while they think that’s great, they haven’t incorporated it. They like the algorithmic improvement—that’s great—but not the challenging of the status quo, how you go about your processes, and the idea that you need to pay people for their lived experience and expertise. 


I still find nuance as I go back and reflect on it—all the things I didn’t pick up initially. Yeah.  


Lexi Spencer-Notabartolo: I mean, if anything, that should serve as confirmation of the validity of your gut. That your gut impulses and your instinctive reactions, far from being crazy, are very spot on. And the way the solutions are coming, based on the work that you and the community that you’ve built around you— 


Marika Pfefferkorn: Yes, because this was not a ‘me’ effort. This was a community collective effort. I couldn’t have done this by myself. Obviously. [laughs] 


Lexi Spencer-Notabartolo: Thank you for all your time.