Bonus Episode 7: Sarah Brayne

 

Sarah Brayne

David Eil talks with Sarah Brayne about her book, Predict and Surveil: Data, Discretion, and the Future of Policing.

Professor Brayne is an Assistant Professor of Sociology at the University of Texas at Austin.

Date: February 9, 2021

A transcript of this episode is available below.



TRANSCRIPT OF THIS EPISODE:

David [00:00:08] Hello and welcome to Probable Causation, a show about law, economics and crime. I'm your host, David Eil, and my guest today is Sarah Brayne, assistant professor of sociology at the University of Texas at Austin. Sarah holds a Ph.D. in sociology and social policy from Princeton University and is the founder and director of the Texas Prison Education Initiative. Today we will be talking about her recently released book, "Predict and Surveil: Data, Discretion, and the Future of Policing," available wherever you buy books. Sarah, thank you for joining me.

 

Sarah [00:00:38] Thanks so much for having me.

 

David [00:00:39] So first, I want to ask: how did you get interested in policing?

 

Sarah [00:00:43] Yeah, I'm not sure exactly, to be honest, but I've basically had this longstanding interest in the U.S. criminal legal system dating back to when I was an undergraduate student in Canada. So, you know, the U.S. and Canada are similar on a lot of dimensions, but one axis on which they're really different is the sheer scale of their criminal legal systems. The incarceration rate in the U.S. is something like eight times that of Canada. And so I went to grad school hoping to study the phenomenon of mass incarceration, but I found that there was already a lot of work being done on that. And no one is incarcerated, of course, without first having police contact. So as a sociologist, police contact is this really interesting site of the exercise of immense discretion and decision making. And so I started to focus on policing as this feeder mechanism into the criminal legal system.

 

David [00:01:31] And I know your background, your training, is in sociology. What do you think are the insights and methodological advantages that sociology specifically brings to the study of policing that a criminologist or an economist might miss?

 

Sarah [00:01:46] Sure, yeah. Well, I mean, I'm a big fan of interdisciplinary work. I was trained in an interdisciplinary program where we had economists, psychologists, political scientists, sociologists, all working together, but I think that there's a couple of things that sociology really brings here. The first is that a lot of preexisting work on the police across different fields looks at them as kind of a monolith, or it portrays them as this monolithic organization, so there's a tendency to homogenize cops and say, you know, the police do this or the police do that. But ethnographic methods, which is what I use in this book's research, a combination of qualitative interviews and observations, can really help to reveal the variation that can exist at the sub-organizational level.

 

Sarah [00:02:30] So, you know, not everyone in the LAPD was doing the same thing with data. But at the same time, the second thing that I think sociology, or a sociological approach, helps with is that it's really great at detecting and analyzing patterns. Sociological theories and methods help us understand things like: why might sworn officers be different from civilian employees? Why might a captain react differently to the introduction of a predictive algorithm in his division than a patrol officer would? That kind of thing.

 

David [00:03:00] So let's talk about your experience with the LAPD. It's fascinating how much information you were able to get and how close you were able to get to their work. You clearly spent a long time with them and interviewed a lot of people. How did you get access to the organization?

 

Sarah [00:03:19] Yeah, I mean, it wasn't easy, I'll say. And definitely, you know, I'm often asked how I got access to the LAPD because the blue wall of silence, as it's called in policing, is definitely notorious.

 

Sarah [00:03:30] And it makes it really difficult for researchers or journalists or whomever, for that matter, to secure the degree of access that you really need in order to obtain in-depth data on day-to-day police practices. So, you know, all of my respondents are de-identified or anonymized, so I can't say exactly who gave me access, but I'll give some general points in case it's helpful for folks who are trying to gain access to a difficult-to-reach organization like this in the future. First, I had no previous connection to the LAPD. Sometimes people are like, oh, did you know somebody in there, and that's how you got access? No, I actually didn't. But I gained access to the department by starting pretty high on the LAPD's organizational chart. And the reason for that is that police departments are these really hierarchical organizations, you know, they rely on chain of command. I figured that if I gained access at a point pretty high in the chain of command, the permissions would cascade down the ranks. You know, just getting one patrol officer to talk to me, for example, wouldn't necessarily get me the next interview. So I started with a captain who was relatively high in the organization. I just had one meeting with him, and at the end of the meeting I asked, you know, is anybody available to take me on a ride along? And then I'd go on a ride along, which is basically a seven- or eight-hour interview. And at the end of the ride along, I'd say, hey, can you give me the email address of that captain you mentioned in another division, that type of thing.

 

Sarah [00:04:51] And so the first six weeks of that, what we call snowball sampling, was definitely the most grueling and difficult period of trying to get access and doing my fieldwork, because it involved a lot of rejection and a lot of cold calls and ignored emails and loitering around division offices, that kind of thing. But as time went on and I talked to more and more people in more and more divisions, I started developing these working relationships with folks. And the dynamic really shifted from struggling to get people to talk to me to having people start to invite me to meetings or have conversations or call me after a shift to tell me about something that happened that they thought might be relevant to my research. Or, you know, one time they were like, hey, do you want to go on a ride along in a helicopter? I was like, sure, sounds good.

 

David [00:05:37] So, to the extent that you could tell, what were they worried about in your first contacts with them that you had to get them to understand they shouldn't be worried about? And conversely, were there other things that they were excited to tell you, or benefits that they saw from the relationship, that encouraged them to talk to you?

 

Sarah [00:05:57] Yeah. I mean, the main notion I needed to disabuse them of was that I was a journalist. They were pretty hostile to journalists because they're basically concerned that a journalist is going to come in, learn a little bit of stuff, and then write a high-profile hit piece on them. So, you know, I was like, no, I'm an academic, our time horizons are much longer.

 

Sarah [00:06:17] Like, anything I write is not going to come out for, you know, a couple of years, this type of thing. And at that time I really foregrounded my identity as a student, which is a relatively non-threatening identity, and positioned myself as just wanting to understand: as a department that is on the front lines of police use of big data, what are you doing? How are you using it? What are all the ways? And then the other side of it, which is what they were really eager to talk to me about, was their experience with it. So rather than just uncritically assuming that as these new technologies are introduced into their workplaces, they deploy them kind of automatically, I talked to them as human beings and employees embedded in an organizational structure. And I mean, to a certain extent, I think people like to feel heard and to have a chance to express grievances about their jobs, that kind of thing. And so they definitely didn't shy away from sharing their complaints about the new digital state of affairs.

 

David [00:07:21] When you were writing it, did you think often about, you know, I wonder if this is what they're expecting me to write or how they're going to react to it? And have you had any reactions to it since it's been published?

 

Sarah [00:07:32] Yeah, definitely. Although when I was in the field, I did bring some of what I was thinking about and writing to some of them, because one of the things that was also important to me was to make sure that the descriptive facts were correct. And so I did a lot of fact checking with them while I was in the field, saying, you know, is this an accurate account of how you use this particular tool? Or a lot of the time, also, I would just be like, cops use so many acronyms, what does this acronym even stand for? You know, what are we talking about? And in general, they were pretty receptive and interested, in the sense that, you know, I do accurately portray how they use these different tools and what the different tools are that they use. The points of disagreement were more about normative assessments or implications of their use.

 

Sarah [00:08:21] So I'll give a concrete example. One of the arguments in the book is that with the proliferation of all of these new kinds of dragnet surveillance tools, the cops can now monitor a whole bunch of people who don't have any direct police contact. And so that's one of my main arguments. I remember talking to a captain at RACR, the Real-Time Analysis and Critical Response Division, about this, and he was like, I don't agree. And so I said, okay, you know, why don't you agree? And he was like, well, you know, you still have to have police contact in order to be in our system. And so I was like, well, what about automatic license plate readers? Right? They collect information on everybody rather than just people under criminal suspicion. And he was like, yeah, but I'd say that they're an exception to the rule, whereas I think that they are more indicative of a broader trend that's playing out. So yeah, I guess all that is to say, there were definitely certain points of disagreement normatively, or in thinking about the implications, but it was received pretty well in terms of actual representation.

 

David [00:09:20] So a lot of what you uncover is, at least as far as I can tell, new. I mean, the public did not know about these tools that the police were using before you found out about them. Why is this information not available to the public, when there is, I think, such intense public interest in it and it affects people's rights so heavily?

 

Sarah [00:09:42] Uh huh, yeah. I think, you know, in the seven or eight years since I started this research, the public has started to become much more aware of what's going on.

 

Sarah [00:09:50] But at the beginning, and still to a certain extent now, there's so little understanding of the sheer scale of police surveillance, I think. And, you know, unfortunately, of course, people in heavily surveilled communities have experienced and felt the weight of policing. But even they at the time didn't know that there were criminal risk scores being calculated on them, for example, that type of thing. And I think this kind of information is shielded from public view for a couple of reasons. One is that the technology moves so much faster than the laws and regulations governing its use. So the police are kind of operating in this gray area where it's like, would this practice hold up in court? Would this source of data count as evidence? And, you know, I don't know, maybe, maybe not. So the police kind of have a vested interest in keeping some of these techniques under wraps, because it's impossible to hold something that's invisible accountable. So I think that might be one of the motivations.

 

David [00:10:42] As you note in your answer, there's a difference between evidence that leads the police to take investigative steps, which then yields evidence that's later introduced in court, versus evidence that they're going to have to introduce in court itself in order to prove guilt. And it sounds like a lot of these things are the first kind. Like, they're able to narrow down suspects significantly through the license plate readers, and then they go interview them and gather the more traditional kinds of evidence that people would expect police to gather for the last however many decades. And as you say, maybe people never find out about the use of the license plate reader. Do you think people should still be troubled by this use of data, even if it's going to require corroboration later from more traditional kinds of evidence?

 

Sarah [00:11:39] I mean, I would argue, yes.

 

Sarah [00:11:40] And when I present this work to lawyers, for example, defense attorneys in particular are often kind of horrified, because they're like, this is insane, I had no idea the means by which my clients were coming under suspicion in the first place. Right? Like, I didn't know why the cops were sitting outside my client's house, or just happened to be there as he exited this garage, this type of thing. And, you know, a counterargument is, well, if you have nothing to hide, you have nothing to fear. Why do you care if the police are collecting data on you or you're in their databases? If you haven't done anything wrong, you're not going to get caught.

 

Sarah [00:12:14] Well, I think that kind of logic relies on the assumption of an infallible state. Right? The idea that all of the actors who are entering information and making decisions and inferences based on this information do so without any error or bias or prejudice. And that's an assumption that's just not borne out in research. I mean, we have research demonstrating that error is unequally distributed. So, for example, there's a study out of Michigan that shows that black folks are seven or eight times more likely than white folks to get wrongly convicted of murder. And there's potentially a link to DNA databases in that, in the sense that in order to be a hit, you have to be in a database in the first place. So I think that even just unequal database inclusion is something that we should know about.

 

David [00:13:05] You also talk about function creep a fair number of times in the book. What's function creep and how does that relate to data?

 

Sarah [00:13:13] Well, function creep is basically just the idea that data originally collected for one purpose can be used for another, often unanticipated or unintended, purpose. And it's really a fundamental component of the big data landscape. There's this line by a couple of surveillance scholars, Mark Andrejevic and Kelly Gates, that says "function creep is not ancillary to the data collection process. It is built into it. The function is the creep." So I think that, you know, a lot of the time what's happening is the police have long collected their own data, but increasingly they're securing routine access to a range of non-police databases from organizations and institutions that have nothing to do with the criminal legal system or crime control. And so it's really that repurposing of data that is function creep.

 

David [00:14:01] So one interesting Supreme Court case that came down while you were writing the book, maybe even as it was almost going to print, is on this issue: Carpenter against United States. All the lawyers listening will already know what I'm talking about, but for anybody who doesn't, this is a 2018 case that covers police subpoenas of retrospective cell phone location data. So I believe the police focused on some individuals for some other reason and then went to the cell phone companies and said, we want to know where this person was, based on their cell phone records, for this period. I think it was a whole year that they asked for records for, and the Supreme Court said that that violated the Fourth Amendment. So how did that intervene in your writing process as you were finishing up the book? And what impact do you think that case might have going forward?

 

Sarah [00:14:59] Yeah, so it did. It was happening right as I was writing the book, basically.

 

Sarah [00:15:03] So in the chapter where I talk about Carpenter, and, well, where I talk about the law in all of this, I just had this big highlighted yellow blank section that said "TBD, Carpenter decision," because I knew that it was going to be really consequential. And one of my favorite parts about the case, which is particularly germane, is that the guy, Timothy Carpenter, was accused of stealing cell phones. So there's an irony there. Anyway, yes, as you suggest, it was about CSLI, or cell site location information. And I think the implication of it, although lawyers would definitely be better equipped to answer this than I am, is that it basically sidestepped the third party doctrine. That's the legal doctrine holding that people who voluntarily give information to third parties, like banks, but cell phone companies being what's relevant here, and also Internet service providers, email servers, whatever, have no reasonable expectation of privacy in that information.

 

Sarah [00:16:04] And so in the Carpenter decision, the majority opinion held that police getting more than six days of cell site location information constituted a search for Fourth Amendment purposes, because it violated reasonable expectations of privacy. And one of the reasons for that is the argument that it's basically impossible in today's age to opt out of having a cell phone, and cell phones ping your location all of the time. For many jobs, particularly in the gig economy but a whole bunch of other jobs as well, you need to have your cell phone on you; for a lot of people it's their only phone, this type of thing. And so this idea that you are voluntarily giving your information to third party companies is considered anachronistic in the digital age, according to this opinion.

 

Sarah [00:16:53] And I think that the implications of Carpenter potentially extend beyond the specific context of cell site location information. It might serve to protect personal digital trails with all kinds of third parties, like automatic license plate readers, which we talked about earlier, or social media posts. But my understanding is that it remains to be seen how the lower courts are going to deal with the third party doctrine moving forward.

 

David [00:17:20] Yeah, I think it's fair to say that it breaks new ground, but it's also kind of tentative. The other thing you talk about throughout the book is tech washing. What's tech washing, and why is it important for data-driven policing?

 

Sarah [00:17:36] Tech washing is kind of a riff off of greenwashing or whitewashing. It's this idea of replacing, or at least appearing to replace, subjectivity and things like legally contestable bias with objective and neutral and colorblind numbers, through processes like computation or quantification or automation. So a concrete example is the idea that it's problematic to stop somebody because of their race. Well, the logic goes, maybe it's less problematic to stop them because they have a high criminal risk score, because, you know, the data speak for themselves, numbers are unbiased, they come with sort of an air of what Ted Porter calls mechanical objectivity. Even if those scores are in part a proxy for some protected category like race.

 

David [00:18:23] So even without the data, police are going to be exercising their discretion in various ways. I mean, I think your subtitle captures a lot of the tension there: data, discretion, and the future of policing. As you say, a lot of people at least hope, or maybe imagine, that this use of data can substitute for individual police officers' discretion. But without the data, there's still the fact of the decision, and it leaves police without guidance, and many people are worried about how they'll use that discretion. Is there a way to use the data to direct individual police officers' decision making without just tech washing the old sources of bias and inequality? Or is it a tension that we're stuck with?

 

Sarah [00:19:20] Yeah, well, I think that one of the theoretical arguments for how data can be used to reduce bias and problematic levels of discretion is exactly what you were talking about here, this idea that it can correct for incomplete information that individual officers may have. So social psychological research, for example, demonstrates that humans are cognitive misers, which basically just means that we rely on cognitive shortcuts in order to make decisions. And a lot of the time, you know, police officers are operating in instances where they don't have a lot of information and they need to make decisions quickly. And that's when humans rely on stereotypes the most, in those moments of incomplete information. So the logic goes that if you're able to fill in some of that incomplete information with accurate or unbiased data, it might reduce bias in the decision making. I was not able to see that play out in practice. And I don't think there truly is any such thing as raw data. But that was one of the theoretical arguments, at least. Now, that's not to say that I don't think there's potentially a role for data moving forward. But we're in this moment of national, or I guess international, reckoning about the future of policing.

 

Sarah [00:20:42] And there are all these discussions, right, about variously reforming or shrinking, defunding, abolishing the police. And I've noticed that in a lot of these conversations, some folks are suggesting that data-driven policing should be part of the solution. They're suggesting we can apply big data to any number of problems in policing. Like, defunding the police and need to cut costs? Data can help allocate your resources more efficiently. Need to reduce racial bias in officer decision making? Automate it, or use predictive algorithms. But I think that big data isn't a silver bullet in this case. And contrary to popular accounts, I think that this type of ethnographic approach shows that big data is not necessarily more objective or less biased than discretionary human decision making. And in that sense, algorithms don't transcend, but rather are shaped by, the social world in which they're used.

 

Sarah [00:21:36] So I think that what we can do, however, is think a lot more broadly about how data can inform our allocation of resources. One of the key challenges today is that we don't currently have a fully viable operational alternative to the police. And so we could take up this investment in alternatives to the police and use data to allocate non-punitive resources, get away from this sort of, if you're a hammer, everything looks like a nail. Well, of course, if your only interventions are punitive, it's going to exacerbate existing inequalities. But if we can use data to direct non-punitive interventions, it could, at least in theory, help, and we could test in practice whether it helps to reduce some inequalities and reduce crime.

 

David [00:22:21] It's also, and this is a point that you take up at various points in the book, that the data and its use are so opaque. And if you think that one of the ways to resolve people's alienation from the police is to move more towards what people call community policing, which can mean a lot of different things to different people, but, you know, the idea that it's more your friendly neighborhood police officer, who is walking around all the time, knows everybody's name, tailors their practices to their understanding of the needs of the community, and gets a lot of information from the community, both about what's happening and what their preferences are. It seems like if your goal is to decentralize in that organizational sense, then you have to move away from this kind of centrally directed source of information and decision making that's opaque to the community.

 

Sarah [00:23:22] Mm hmm. Yeah. And I think that's part of why some of this big data policing stuff has eroded community trust, has really had the opposite effect: because it's largely invisible. Right? Community policing is all about having this visible, non-punitive police presence, an ongoing dialog between community members and members of the police. Like you said, there are a million definitions of it, but big data policing is largely invisible. It's not this one-to-one police-civilian interaction on the street; it's police-dataset interaction. A lot of it is very invisible, and when it's rolled out under community members' noses without buy-in on the front end, that can really serve to erode the legitimacy of the practices as well.

 

David [00:24:06] And also, maybe most interestingly, it erodes the interest that police officers have in their own jobs. Some of the most fascinating parts of the book were about the tensions within police departments over how to use this data and whose job it encroaches on. And it seems like at least the beat officers and street-level police officers feel like it's kind of moved in on their territory a little bit.

 

Sarah [00:24:35] Yeah, I mean, this is the beauty of ethnographic research: unexpected things come up, you know, it's full of surprises. So when I started my fieldwork, I was kind of ambivalent about how the police would respond. On one hand, criminological research and media portrayals sort of portray the police as having this voracious appetite for new technologies, you know, saying it's akin to Minority Report, it increases their surveillance capacity, all of this type of thing. But on the other hand, work by work and employment scholars would suggest that there might be some resistance from officers. And I started to get toward some of the answer on my very first ride along. The sergeant pulled up to this vacant house and manually entered on his laptop that he was code six, like he'd arrived at the address and that officers were investigating. And in that moment, I was like, oh man, I picked the LAPD because I thought it was super technologically advanced. Why is he manually typing this into his computer? And I asked him, you know, isn't there some automated mechanism for knowing where the patrol cars are? And he was like, oh yeah, there is, every car is equipped with an automatic vehicle locator, or AVL, that pings the location of the vehicle every five seconds, but they're not turned on because of resistance from the police officers' union. So it was in that moment that I realized there's really a labor story here. This idea that, you know, tech is neutral or data is unbiased, this kind of thing.

 

Sarah [00:25:56] Well, that just does not play out when the police are the ones who have surveillance turned on them. And as you mentioned, you know, I found that there was variation in the department, where patrol officers tended to resist more than managers. Patrol officers viewed this as an entrenchment of managerial control, as deskilling, and they really resisted it in ways that managers didn't. So, yeah, I found that part of the fieldwork really fascinating.

 

David [00:26:24] And you also talked to some of the tech people within the department, too, it seems like.

 

Sarah [00:26:29] Yeah. So there was also another division between sworn officers and civilian employees. Civilian employees are crime analysts, this type of thing, who, not having gone through the academy, have kind of a different professional identity, definitely different training. And I found in many cases they were sometimes more critical and more candid about how technologies were used. They didn't seem to have the same type of allegiance.

 

David [00:26:57] So they didn't kind of have the law enforcement mindset quite so much, maybe.

 

Sarah [00:27:03] Yeah, not so much. I mean, there was, in the sense that they were still working toward the same mission. But a lot of the time, you know, I would ask somebody in fugitive warrants, like, how do you do X, Y, Z with some platform? And the interviewee would be like, that's law enforcement sensitive. And then I would ask the same question of somebody who works in information technology, a civilian employee, and he'd be like, oh, well, I would query this data set. So yeah, that kind of variation was also pretty interesting.

 

David [00:27:32] Does the data use also change the way that police departments interact with each other? I mean, my impression is that, you know, one city's police department is a pretty self-standing organization, and they might call over to even a neighboring city sometimes for some information or something, but they're quite separate organizationally. If they're using the same data sets provided by the same private companies, and maybe information is flowing back and forth between them through those data sets, does that kind of integrate these two different jurisdictions more?

 

Sarah [00:28:08] Yeah, I mean, they're more integrated, but everything is far from being fully interoperable. So there are institutions like fusion centers; the one in Southern California, for example, is called JRIC, the Joint Regional Intelligence Center. These fusion centers are federally funded surveillance organizations that were largely built in the wake of 9/11, which was kind of viewed as a case of information-sharing failure in the intelligence community. So, you know, local law enforcement agencies can call these centers, but typically it still is relatively piecemeal. There isn't just wholesale access to other law enforcement agencies' data sets or anything, though there are some federally managed data sets. But I do think that that's the direction in which it's moving. So in L.A., for example, you know, Santa Monica has their own police, even though that's kind of right in L.A.'s territory. And so when there was a robbery series happening and one of the houses the person robbed was in Santa Monica, they still had to phone Santa Monica P.D., basically, and get the information. So in many ways there was this tension in my work where, you know, one day I'd be totally shocked at the extent of police surveillance, and then the very next day it would be like, oh wow, this basic data integration hasn't occurred yet.

 

David [00:29:29] It seems like there are also at least some instances of frictions between the police department and the providers. Palantir is the one you talk about the most, where it seems like maybe the police don't quite understand what Palantir is doing, and maybe Palantir doesn't totally understand what the police want. Is that a culture clash, or just a clash of objectives, you know, Palantir wants to make money?

 

Sarah [00:29:57] You know, honestly, most of what folks in the department who were familiar with Palantir had to say about the company was really positive. They were saying that Palantir is super responsive to their needs, that kind of thing. And in the context of Palantir, you know, talking to them about some of the ethical concerns, there's sort of a tension where, on the one hand, they say, you know, we build in these different things like access controls or immutable audit logs and stuff. But then if their clients, like the LAPD, never use them, the existence of that technology is kind of moot. And this is why it's a problem that we have no robust regulatory regime at the federal level, let alone the state or local level, governing the use of these technologies. It's just like the Wild West right now.

 

David [00:30:45] Do you get the sense that the people working at Palantir understand that the decisions they make and the information they collect will lead to people being arrested and imprisoned, possibly wrongly, based on what they provide?

 

Sarah [00:31:02] Yeah, I mean, the people that I looked at there have all different kinds of roles in the company. You know, a programmer is different from somebody who is on the privacy and civil liberties team or something like that. But yeah, I think there is generally an awareness of the stakes. At the same time, and I can't speak for everybody in the company or anything like that, I think there also is a faith that technology can be leveraged to solve social problems. There's a certain amount of buy-in to that logic, too.

 

David [00:31:33] Has everything that you've learned in researching this book changed the way you go about your everyday life? When you're driving or walking down the street, do you think, oh, all of these cameras are taking photos of me right now, and all these different people are going to have access to that footage if they want it, and the police know that I was at this intersection on such and such a day, et cetera?

 

Sarah [00:32:00] Yeah. I mean, I vacillate between, on one end of the continuum, being totally fatalist about it, like, okay, well, we're all under surveillance all the time already, you know, blah, blah, blah. And then, on the other hand, being like, no, that is ceding to the very dynamics that I'm trying to elaborate on in this book and in this research. And then I'll engage in these, you know, minor strategies of resistance, like insisting on a manual pat down at the airport or something.

 

David [00:32:35] So what's the kind of next frontier in this area of research? What do you think are the most important next projects?

 

Sarah [00:32:42] Yeah, well, I have a couple. So one of them is following this stuff into the courts, basically. This research really stops at the point of police contact, or at the point of arrest, and I'm really interested in how digital information, broadly defined, yes, evidence being one form of it, but also just this investigative information, is used at subsequent phases of criminal legal processing. So how does it come up in the context of plea bargaining, in the context of sentencing, this type of thing. So that's one project.

 

Sarah [00:33:15] And then another one is broadening the scope of institutions that I'm looking at. I think that data-driven decision making and predictive algorithms and such are, of course, not the exclusive domain of the police or the criminal legal system in any way. There's a whole range of institutions and organizations in which actors are adopting these kinds of tools for making decisions and allocating resources. And so I'm interested in doing some comparative analysis to look at, for example, how the same types of predictive technologies play out similarly or differently when deployed in different organizational environments. So before COVID, I had something with health care organizations, but yeah, thinking about health care, education, that type of thing.

 

David [00:33:58] Great. Well, I can't wait to read it. My guest has been Sarah Brayne. Her book is "Predict and Surveil: Data, Discretion, and the Future of Policing." Sarah, thank you so much.

 

Sarah [00:34:08] Oh, thanks so much for having me.

 

David [00:34:15] You can find links to all the research we discussed on the show on our website, probablecausation.com. You can also subscribe to the show there, or wherever you get your podcasts, to make sure you don't miss a single episode. Big thanks to Emergent Ventures for supporting the show, and thanks also to our Patreon subscribers. The show is listener supported, so if you enjoy the podcast, please consider contributing via Patreon.

 

David [00:34:40] You can find a link on our website. Our sound engineer is Jon Keur, with production assistance from Hailey Greishaberr. Our music is by Werner, and our logo was designed by Carrie Throckmorton. Thanks for listening, and I'll talk to you soon.

 
