Bonus Episode 12: Sarah Lageson

 


David Eil talks with Sarah Lageson about her book: Digital Punishment: Privacy, Stigma, and the Harms of Data-Driven Criminal Justice.

Professor Lageson is an Associate Professor of Criminal Justice at Rutgers University – Newark.

Date: March 8, 2022

A transcript of this episode is available here.



 

TRANSCRIPT OF THIS EPISODE:

David [00:00:08] Hello and welcome to Probable Causation, the show about law, economics and crime. I'm your host, David Eil, and my guest is Sarah Lageson. Sarah is an associate professor at the School of Criminal Justice at Rutgers University and holds a Ph.D. in sociology from the University of Minnesota. She is the author of numerous articles, both in scholarly journals and the popular press, as well as her recent book "Digital Punishment," which is the topic of our conversation today. Sarah, welcome.

 

Sarah [00:00:34] Thank you for having me.

 

David [00:00:36] So let's just start out by asking how you got to where you are now. You have a lot of young listeners, students and young professionals who are thinking about different career paths. How did you end up following this one?

 

Sarah [00:00:48] I actually didn't know anything about sociology. I went to WashU and there was no sociology department there at the time, so I studied history and anthropology. And then after college, I joined AmeriCorps, and when I was looking for an AmeriCorps position, I was really interested in public health and inequality, and I thought I would do something that was more focused on global health. And I saw this listing for a nonprofit in Minneapolis that described mass incarceration, which is a phrase that I hadn't heard of; this was a long time ago now. And I realized that there was this massive public health crisis happening just due to the size and scope of our prison system. So I ended up applying for and working at this prisoner reentry nonprofit for a couple of years. And I met a sociologist there and was introduced to research for the first time, and I was really fascinated by the methodology of the field experiment that I was a research assistant on.

 

Sarah [00:01:50] And I just thought it was so interesting that you could go collect empirical data that could speak to something that maybe we know is happening in the world, but we couldn't prove it necessarily or put evidence behind it. And so the professor that was working with our nonprofit, Chris Uggen, became my advisor, and it was sort of off to the races. But, you know, I didn't know anything about the field or about PhD programs or anything like that. And it is possible to kind of learn as you go.

 

David [00:02:18] Yeah, it seems like so many people who end up with a Ph.D. are people who come from a family of professors. But you can also get that kind of inspiration and guidance just from, you know, people who you work with or for.

 

Sarah [00:02:31] Yeah, yeah. And he was a great mentor; he's still a great mentor. It was really crucial for me to find someone to show me the ropes, and he's still available over phone calls when I need him.

 

David [00:02:43] That's a good, good lesson to us all to pass that gift on to future generations. So I know you have extensive legal training as well. What brought you to that and how has it affected your research?

 

Sarah [00:02:55] I thought about going to law school in college, but it was prohibitively expensive. That was another thing about the PhD program: the funding and support were crucial for me to be able to do graduate school. So law was something that intrigued me. I took a law class as a PhD student and I did some research for a law professor. And then, you know, I finished my Ph.D. and got my job at Rutgers, and the School of Criminal Justice, which is my department, actually shares a building with the law school. So it's always been this proximate place. And I kind of got to this point in my research where I was trying to learn law on my own, because there's so much topical intersection of criminal legal system research with the laws on the books. And I think you can have a great career as a researcher and not have to know the law, but for the kinds of questions I was asking, I was really yearning for that understanding, for understanding that language. And then the thing that really made it click for me was that I was retained as an expert witness in federal court.

 

Sarah [00:04:01] And I hadn't taken a course in anything, not even civil procedure, at that point, and was totally confused by, you know, the structure of the courts and the way that questions were asked and answered. And that sort of pushed me to really go and dive into law school. So I became a law student while I was a professor, which is not common.

 

David [00:04:25] I'm sure after a couple of classes your confusion was just cleared right up, and now the procedure is crystal clear to you.

 

Sarah [00:04:32] Right. So that's a big learning, right? Everything is more complicated. I mean, I think the biggest takeaway for me in learning law after being steeped in the social sciences is that there are just fundamentally different ways of knowing the world, and normative questions and doctrinal questions and research questions are all really different types of questions. On the other hand, though, I have this running list of like 500 new research ideas based on cases I was reading in class, because you'd read a case and you'd say, well, that's interesting, but how would this be implemented? How does a police officer thinking about this change their job? What patterns do we see after this kind of Supreme Court decision?

 

Sarah [00:05:09] So I think there's a lot of room there for social scientists to get ideas from the law. I have a dream of writing a book called Criminal Law for Social Scientists, so maybe in a few years you could add every week about that.

 

David [00:05:21] I would love that. I can't wait. I would love to read that book, for sure. But first, let's talk about the book you've already written. How did you come to focus on this particular set of questions for a book project?

 

Sarah [00:05:32] This was actually my dissertation, and it looked very different in dissertation form. But like I said earlier, I was working with the sociology department on an experimental field audit. So we had hired actors, essentially, to go out and apply for jobs, and they were trading off reporting a non-conviction, a very low-level misdemeanor criminal record. We were testing the employer outcomes to see, you know, does this type of legally low-level stigma matter for hiring, and how does that change by race? And that was a really interesting question for me, because in my nonprofit work, we always talked about a criminal record as being a binary: either you have one or you don't, like a zero or a one. And now we were testing kind of the boundary of what would constitute a criminal record, you know? So that was one part of my thinking: I think we've got to expand our definition of what a criminal record means, what a stigma really means. Is it only a felony conviction? And then a guy in my neighborhood started a blog, and he was writing about crime in his community, and he was going to the county online jail roster and copying and pasting it, reposting it on his blog.

 

Sarah [00:06:45] And then he would highlight addresses that his words were properties owned by slumlords. Or there were just addresses that were in the proximity of this community. And he would sort of do this like blog amateur journalism about crime. And I think what threw me out was like, this is this is just a stigmatizing. It just is shaming or potentially could be used for the same types of discrimination that we have attach only to legal criminal records. And so I called him up. How are you doing this? Is this legal? Where are you finding this stuff? And he was my first interview for my dissertation and then my book, and he's been a key research informant. We continue to talk ten years later pretty regularly.

 

David [00:07:30] So that leads right into my next question. You already mentioned the various methods that you put together in assembling information for the book, which include, you know, court cases and interviews, not only with this blogger but with a whole array of different people. Who are all the different kinds of people you talked to in preparing the book?

 

Sarah [00:07:54] Yeah. And the book is really rooted in the generosity of hundreds of people to spend time with me and talk to me. So I started with a mix of fieldwork and interviews. I would go to legal aid or I go to court in Minnesota, where I did my dissertation. You could watch expungement hearings, which raises an interesting question about it being a public hearing in the public record, even though it's about concealing information. But I would just watch the judges interact with people and then I go to legal aid events about expungement and we'll talk about my study as it took shape and recruit interviews there.

 

Sarah [00:08:27] So what have two people interviewed one hundred people I interviewed the book are people who have some sort of criminal records and are dealing with it, and there's a huge variety. There's people who have turnstile jumping arrests only 12 years ago to people who are on registries or served significant time in prison, things like that. So it's really a broad array of kind of backgrounds there. The other half of the people I interviewed were content producers like the blogger, but also people who run people, search websites, data brokers, data aggregators and then lots of experts in the field. So attorneys, legal aid providers who sort of bring my attention to different aspects of the issue that I perhaps wasn't wasn't understanding.

 

Sarah [00:09:10] And then there's sort of a mixed methods component actually using people's criminal records as a data source. So I had a federally funded study kind of going on at the same time, I was doing the book research where I recruited people in two states and paid the state for them to get fingerprinted and get a copy of their rap sheet. And then I was running the research participants names with their consent through all these different types of background checking services and websites and mugshot repositories, and just trying to understand the whole record starts at state level. How does it sort of evolve and change over time? Where do we see patterns of error coming and where do we see different types of records showing up on the internet versus ones that seem to kind of just disappear? So that was sort of like this kind of mixed methods like original data collection, administrative data review, sort of all jumbled together.

 

Sarah [00:10:01] And then, like you said, a lot of legal research and policy analysis and all sort of policy and practice, so I worked with a team of graduate students to look at, for instance, like every state online court system to see what types of personal data every state is, for state looking at the 50 largest law enforcement agencies and whether or not the police mugshot. So not only what is the written policy, but what are these agencies actually doing on the internet?

 

David [00:10:27] Great. Yes. The book is called "Digital Punishment," and I think each word in the title points toward a unique way that you're looking at the system. Let's start with the second word, punishment. I think normally people think of punishment resulting from a criminal case as being principally incarceration, but also maybe, you know, fines or restitution, something like that. And then maybe the kind of cutting edge of repercussions of a criminal record is collateral consequences, but usually even those are thought of as governmentally imposed, or a loss of governmental rights or benefits like welfare, housing, maybe immigration status, voting rights, occupational licenses, things like that that are government controlled. But a lot of the punishment that you're interested in is punishment that comes from non-governmental actors, right?

 

Sarah [00:11:24] Yeah, that's right. It's sort of I think that for most people in the study, they were most deeply impacted by these extralegal versions of their record because they were totally unmanageable. And there's always the element of mystery and surprise someone would, you know, 12 years, 10 years, eight years after an arrest, then their mugshot would pop up on the internet. You know, it's up for that person the legal outcomes of their case have been settled. You know, they paid the fine or they spent some time in jail or they've dealt with the background check and they've secured employment they talk to the employer and they sort of moved on.

 

Sarah [00:12:01] And then all of a sudden, these things pop up in the private sector or apply for two different jobs that the companies use to different background check companies know that totally different report. So I think a lot of people I talked to that was really, really frustrating because it felt like this shadowy, confusing institution that was causing them all these headaches you know you dig into it. And while that sort of newest mechanism might be the private data industry, you know the harm it is beginning with the state and none of this industry, the criminal records commodities industry would exist had the state not made this information public in the first place. So the extralegal actors really can't play a role until the states decided to share personal data. And it's, you know, I think we think about this as a criminal record again as this one piece of paper that's like your rap sheet, but the book really lays out that there's dozens of different types of state produced criminal record data.

 

Sarah [00:12:54] So Jill repositories, prison directories, mugshot court records and then what's really key is that the state made a decision, whether explicitly or sort of accidentally sometimes to include all sorts of personal data about people. So their height, their weight, the birthdate, their photograph, biometrics, their tattoos, their home address, their lawyers name, like all sorts of details that make criminal record data really, really valuable to the private sector, because usually we have to pay for that like if I'm a data broker and I want, you know, geo location data, I got to pay like an app that's been collecting it on their consumers and reselling it. But here you have data that's governed by public records law, so the price takes quite low.

 

Sarah [00:13:39] But of course, what happens then is that all the data that's being shared and collected by the private sector is structured in the same way that the legal system isn't structured by race, by neighborhood, and by all sorts of bias and discretion, by system level actors. So in that way, I mean, I think this kind of spills into thinking of that only is that the non-governmental actors in can punishment, but also enacting sort of creating surveillance.

 

David [00:14:02] A lot of the stuff has been around for a long time, like arrest records have existed for a while background checks, I would guess, too, certainly, racial bias in the criminal justice system is nothing new. But how has that kind of changed and metastasized with the digital world?

 

Sarah [00:14:22] Yeah. So one thing that like totally struck me in doing this research was how much the U.S. relies on practical obscurity to sort of be a stand in for data privacy. And by that, I mean like paperwork and bureaucracy and sort of administrative hassle used to be the way that these records were actually protected because we don't have a concise conception of right to privacy, especially when it comes to things like criminal legal system data that's deemed in the public record. So the courts, I mean, if you look at how the courts have struggled with the digital turn, they're always kind of stuck thinking about how practical obscurity was this protective factor for us. And that's not I mean, there is no practical obscurity, there's no paperwork, everything is digital. So the first thing that does is makes it makes all the data very, very easy to transmit and all the ways that are really obvious, but I think it also sort of has promoted or created this commodification of thinking about criminal record data as part of a big personal data landscape.

 

Sarah [00:15:29] And in doing that so the data brokers might treat these public records the same as no voting data or real estate data or social media information, but at the same time, you know that we've all these institutions in the US that use criminal records as a basis for discrimination. And so they are a really different type of data. And when they're digital, I mean, it's just sort of like a firehose because each agency makes their own decisions about how data are produced. So essentially, that is what kind of software vendor that they're contracting with or how their data are structured, whether or not that vendor or that software allows them to push information through the internet, whether or not they are implicitly in a data sharing agreement with that vendor, who's selling it to background check companies. And one way to think about it is that, you know, there's about 3000 counties in the U.S., and each of those has probably three to five different branches of criminal justice system within each county. That's making their own decisions about how they're producing information. And then you think about that at the digital scale and it becomes really big and sort of overwhelming, really fast.

 

David [00:16:36] So you mentioned actors using the records as a basis for discrimination, and that includes employers and apartment buildings, things like that, but also maybe even potential dating partners or neighbors, you know, other actors that are less easily regulated and aren't covered by the laws about credit reporting and things like that, the ones that have to be followed by the bureaus that are relied on by employers and banks and so on, right?

 

Sarah [00:17:06] That's right. They're sort of like the Fair Credit Reporting Act, regulated background checking industry and, you know, for a background check company to be under Figaro, they have to be fresh in Consumer Reports and data brokers are not furnishing Consumer Reports.

 

Sarah [00:17:23] So they exist or this unregulated space, even though in practical terms, to the person who has a record, it's hard to see the distinction between the two, even though they're they're governed by different legal regimes and then there's just sort of the people search industry and that is is also totally unregulated because these websites put up disclaimers saying, you know, we're not responsible for the accuracy of this information or just scraping it for public sources. Do your own due diligence. Don't use this information as the basis of a hiring decision. This is just for your own curiosity and so there's this the sources for which you can get information that there's many more of them in the unregulated sector than the regulated sector will have to pay for it.

 

David [00:18:06] I guess, people might be just curious about, you know, their neighbors past or their the past of, you know, someone they just went on a first date with or something. I think the dating example is kind of interesting because I mean, it's such a private and personal decision that I think a lot of the reasons for choosing someone that we might think of as really morally wrong and against public policy goals in the context of like a hiring decision are really hard to impose on someone in the context of a dating decision. But at the same time, it's a decision that has large welfare impacts on the people who are the focus of it so, you know, you know, say you go out on a first date with somebody and you're interested in them, but you want to know where that about them. You google their name and you see, you know, an arrest from a few years ago. Can we blame somebody for finding that information useful? And if so, like, how should we tell them that that's not something they should care about?

 

Sarah [00:19:05] It's a good question.

 

Sarah [00:19:06] I think there's sort of like a way that I try. I'm trying to reframe that question, I guess, which is, you know, people want to be safe. And, you know, I think we forget sometimes it like meeting people in the internet, meeting strangers, whether at a bar on the internet isn't always safe, right? And we want it to be. And so we use things like Google to try to make this safer. And I think that's normal. And I think that's a good thing to want to protect yourself and to protect your loved ones. I think that we are a bit mistaken when we use the legal system as the barometer of another person's worth or riskiness. So I think like if we want to believe that an arrest record is going to tell us factual information about whether or not someone's dangerous, then we also have to say that we believe the system is fair, that the system is just and the system is accurate.

 

Sarah [00:19:57] And I can tell you that based on all my research, the system is not fair and it's not just it's not accurate. So I worry that people use things like criminal records to make them safe, but it's bad data, it's confusing data, and it's really biased data. And so I think we've sort of substituted in America the criminal record for other types of kind of information that they keep in mind most of the rest of the world doesn't have access to this information and their meeting partners and having relationships and hiring people without having access to things like arrest records. And they're doing fine.

 

David [00:20:32] Yeah, I agree with that, but just to play devil's advocate and push it farther. So the U.S. is a fairly violent country compared to other countries, and even if you know a lot of the records are inaccurate.

 

David [00:20:45] And one of the things your your book details is how inaccurate these records can be, but even so, even if there's a lot of noise in there, there might be some signal to an even with all the bias in the system, there's some information there, too. So comparing two people and one of them has an arrest from a few years ago and one doesn't. Maybe I shouldn't put too much weight on it, but the same time, it might be useful information, at least something that I might want to ask the person about if I are concerned about, you know, putting myself and vulnerable situations with them. So even if we, you know, explain to people kind of all the ways in which this information was, you know, uncertain or, you know, defective, like it could be wrong, you know, this is just an arrest. It may not have led to a conviction, and it might have been just a complete case of either police misbehavior or a misunderstanding, whatever. Now, even accepting that all those things could be the case, it still could be that people could find some usefulness in this information. And maybe some people might find that even that little bit of usefulness is enough to justify the availability of this kind of information, and they're searching for it.

 

Sarah [00:21:57] Yeah, I think that the arrests are a good example here of, I think, the less useful information when we think about the spectrum of criminal record information that we could be using. And so, you know, I think it is very low hanging fruit to say arrests are not reflective of a person's behavior, necessarily, they're much more reflective of a police officer's decision to arrest. When we look at conviction records, so, you know, so there's all these problems baked into the production of that data as well.

 

Sarah [00:22:25] At least we could say, well, there was a police officer, there was a prosecutor, there was a defense attorney, there was a judge and there was sort of a bit of oversight and some sort of process happened. You know, most people are pleading guilty. So you can also sort of raise concerns with the validity of a conviction record. And of course, you know, charges and convictions often are pretty different version of events and what actually happened because of just the way that the system works, but I think that we have a better framework for understanding useful information if we look at conviction records that are recent that are related to the question that you have about a person.

 

Sarah [00:23:03] So if you're you're worried about trust to be thinking about the types of crimes that might be related to that. And what's interesting is that we actually have a policy framework for rap sheets that for most states, incorporates all these things. So a rap sheet is all of the police data, right? Usually it's based on biometrics like fingerprints. So every time you're arrested, the arrest goes in there and then it blinks the court data so you get a final disposition. There are data accuracy and updating problems in a lot of states, but for the most part, the rap sheet is sort of this compendium of like, here's every sort of thing that happened, and here's what the judicial or the court outcomes were, but rap sheets are protected because they are often the policy that surrounds rap sheets rooted typically in most states, somewhere in the penal code or in sort of the administrative regulations around the state police that hold these records. And they're expensive. You have to get consent of the person who is the subject of the record to look at it, they're held to accuracy standards within the state the person has an error on their rap sheet. There's some sort of process where you can petition to have it fixed.

 

Sarah [00:24:10] So we have this great kind of policy framework already that governs conviction records. And that's so if I was like volunteering in a volunteer organization and I want to someone who wants to volunteer, I can go to the state and say, Can you give me the record, right? And then I know the problem is that all these other agencies are really seen in a very piecemeal fashion with no accuracy and no sort of sense of what the information is going to be used for all the conviction and all the incarceration data. And so this is a roundabout way of getting to your question, which is that I think we do have a better version of criminal records in the rap sheet than the conviction data that perhaps is more useful for these types of goals that are valid goals.

 

Sarah [00:24:54] But those are expensive and hard to get, and so we rely instead on the stuff that comes from other parts of the system that's reproduced on Google. And I think it's just by and large, really, really misleading.

 

David [00:25:09] In addition to being expensive and hard to get, the conviction records are also just kind of produced with a different information structure than the arrest rate. We have a strong constitutional commitment for a variety of reasons that conviction should be established with guilt beyond a reasonable doubt. And you know, if of a fact finder is just kind of thinks it's well, it's more likely than not that this person committed the crime, but not sure beyond a reasonable doubt. So they're going to get an acquittal or at least they should. But people, you know, when they're interested in, you know, some kind of relationship establishing some kind of relationship with somebody they might find it's quite valuable to know that it's more likely than not that this person committed this crime.

 

David [00:25:56] And, you know, maybe just probable cause for arrest. Does that likelihood of them having committed a crime is enough for that person to find that information useful? So it's interesting to me that this kind of shadow system stuff sent to provide a demand for information that the legal system is is hiding for, you know, the reasons that the legal system wants to generate information in this particular way in other words, you know that it wants to have record convictions only where certain procedures have been followed. So it's not just go beyond a reasonable doubt, but you know, if you have evidence excluded because a constitutional right was violated in the acquisition of that evidence, then an arrest will not result in a conviction. We have good policy reasons for having that rule, but at the same time, if you're somebody who's you know, thinking about hiring that person as a babysitter or whatever, then you're not so interested in the police violated their constitutional rights and searching them you're interested in what they did and the conviction records are not going to tell you that, whereas the arrest record might give you kind of a good idea of that. So I guess, is it a product, not just of the conviction records being difficult to get, but also that they kind of in some ways have too high an evidentiary bar to record something, perhaps?

 

Sarah [00:27:16] I think that the issue that we run into then is what is the person to make of arrest records, then as an information source used for decision making, if those data also are being produced for that reason. So arrest records are the only reason they're being produced is that it's a public records guidance that we want to watch out the police.

 

Sarah [00:27:35] That's really the foundation of why arrest records are public. An arrest is newsworthy. You know, the media's always been sort of the entity that's pushed for the ability to get information about arrestees. But a lot of the policy framework is really about public records because it's a record of police activity and when data is being produced. For that reason, it doesn't become very useful for decision makers because there's absolutely no context involved. Rates will resonate. It's really just like a list of people who are arrested, maybe why they were arrested in the address where they were arrested. And then that gets sort of swept into big data systems where it's combined with other types of public record data. And so I think that even if you know, there's these sort of valid intentions for wanting to look at arrest information for these sensitive areas, you're still dealing with data that's not produced for that purpose that I think can be confusing and misleading for people who are not lawyers who who do not work in this arena.

 

Sarah [00:28:34] And I think we really run the risk then of seeing just the arrest as a mark of, OK, well, then I can't deal with this person at all. I think in some ways, kind of people can use that as a way to legitimize bias stereotypes, sort of things that they have already kind of made a decision about in their head. And then they can use this sort of messy, unclear arrest record as a way to legitimize that.

 

David [00:28:58] So would one possible policy response then be to, you know, not to obscure this information, but to try to contextualize it better, to better ensure its accuracy. So, you know, there aren't people who are, you know, mistakenly thought to have committed some or have been arrested for something, but they were never even arrested for as the response to it in the opposite direction of privacy and instead make their information better, clearer, more usable to the public.

 

Sarah [00:29:27] Yeah, that's always an intriguing alternative. I think, you know, there's there's always been economists that work in this arena that say, Well, why don't we just go for more information? Why don't we just provide more detail? So I think that that's a great theoretical argument. I think that more importantly, though, it's not a practical argument because these are agencies that are not in the business of furnishing information, they're in the business of policing. And so if are we willing to invest in a big data infrastructure so that we've now made arrests information useful as a policy response? Perhaps I just don't, you know, I don't see that happening. That's certainly not where the resources are going or seem to be. And I think it's also important to remember that the legal system is so fragmented and there's so much local control, I mean, a big finding in the book for me was how idiosyncratic and hyper local the decision making is about how information is furnished. Part of it again, is what software vendor are using, and part of it is also, you know, just some person whose task that the county jail to decide whether or not they're going to put the eye color and the height of detainees on the website. And they kind of just Google around to see what the other counties are doing. They decide to do that, too. So in theory, like would perfect information solve all these problems?

 

Sarah [00:30:42] I mean, if it was paired with a robust education on how the legal system works, then perhaps. Practically, though, I think that would be asking law enforcement or the courts to take on a new function, and I don't know what that would actually look like.

 

David [00:30:59] I'll follow up with a kind of surrogate question for Probable Causation's overlord Jen Doleac. She published a paper with her coauthor Ben Hansen on ban the box laws, which have some of the same structure. Ban the box laws typically prevent employers from asking potential employees about a criminal record on their initial application. And one of the things they find is that when you prevent employers from asking about something they may have a real interest in, like somebody's criminal record, they may use proxies for that criminal record, in particular race. So they find that when you institute ban the box laws, it hits young black men particularly hard, because that's a demographic that's particularly likely to have a criminal record. And if employers are prohibited from asking about that record, and from distinguishing the people who have it from those who don't, they end up discriminating on the proxy of race instead. Would there be a risk of increasing that sort of discrimination if there were more privacy around arrest records and things like that?

 

Sarah [00:32:08] I think it's a really good question, and I too published a study around the same time with a similar finding. It was non-significant, but still a finding: in our experimental field audit, when we looked at the scenario where we sent our testers out and the employer did not ask about the criminal record (not because of ban the box; it just wasn't on their application), we saw a similar troubling pattern by race. The thing that's always tough for me about the statistical discrimination findings, though, is that a lot of black people have been wrongfully arrested, and they still have to contend with having an arrest record. So I think sometimes we reserve the conversation for someone who is able to evade police contact while being part of a totally overpoliced group.

 

Sarah [00:32:50] And so if we don't bring that directly into the conversation, I think it's hard for me to speculate on what the right policy outcome is here about obscuring information. And in general, people who have been arrested are very employable; people tend to do fine. So I don't know that it's an obscurity policy I'm advocating, in the same way that ban the box obscures the criminal record at the application stage but an employer can run a background check the next day if they're moving forward with the applicant. For me it's more about thinking about a criminal record like a medical record: thinking about it as an institutionally produced document, from disparate sources, that has very significant value to the person who has the record. And then that record is used by other people to make decisions about you, whether it's health care or insurability or whatnot.

 

Sarah [00:33:43] And if we were to think about a criminal record in that way, as something where we give control of the data, in some ways, back to the person who's the subject of the record, I think that's actually the policy outcome that would be the most beneficial, rather than obscuring part of the record and making part of the record public. I think it's about thinking about it in a more holistic way and giving people control over it. And if we're going to use criminal records in decision making to the degree that we are, there should be an institutional framework around how to do that. And I think that transparency laws and public records laws that were written for the paper era, for a pre-digital era, just fail miserably at comporting with the societal decision to use criminal records in decision making.

 

Sarah [00:34:26] So I think that the medical record analogy sort of helps me think about who has control over the data, who can see their data. I mean, one of the big findings of my study in New Jersey was that no one had ever seen a copy of their own rap sheet, because it cost 40 bucks to go get fingerprinted. So no one had any idea what the various players were seeing. They had no idea what decision makers were even looking at. They didn't even know what the courts had ruled, because they didn't have access to their own data.

 

David [00:34:52] Yeah, a lot of your book talks about how people respond to this sort of digital punishment. What are the different strategies people have when they figure out that they're being denied opportunities because of this information that people are seeing?

 

Sarah [00:35:07] Well, it's really tough, because people often don't know if that's the reason, right? They don't know if their landlord googled them. I mean, every once in a while landlords would just tell them, hey, I Googled you, and I'm not going to rent to you. But for the most part, they don't know, and they don't know, based on where that person is using Google or what that person's own search and algorithmic histories look like, whether or not they're even going to pop up. It's a really scary and confusing landscape of not knowing what information is actually out there, and most people are understandably completely overwhelmed by that. And so there's a lot of research, in my field and in other fields, that looks at systems avoidance, where people who feel either under surveillance or stigmatized in institutional ways try to avoid those institutions. And in my context, it was really people trying to avoid social or professional situations that might trigger a Google search.

 

Sarah [00:36:04] So it actually flies in the face of everything we know about crime prevention, right? Which is: how do you support desistance, help someone move on from a criminal path? Well, you ensure that they have a partner or a family, sort of stability in that domain, safe and stable housing, safe and stable employment. And when people are evading these situations that might reveal their criminal record, and they don't even know what their criminal record is going to say because it looks different based on all these different sources, well, then they're not participating in the pro-social activities that would actually help us reduce crime in the long run. So it's systems avoidance, and in my book I talk about digital avoidance, because they're really overlapping. I mean, you can't sort of avoid a background check or avoid a Google search at this point.

 

David [00:36:55] Yeah. And also, as you describe in the book, maybe 30 years ago, if you had a conviction that was giving you trouble, then if you could get it expunged, maybe that solved your problem. But with these private digital databases, it seems very difficult to control that kind of information.

 

Sarah [00:37:16] It's really tough, because we're in this very busy policy moment of clean slate. And there's a lot I could say about clean slate, a lot we don't know about clean slate yet. But a lot of clean slate policy is targeted toward automating expungement for very, very low-level things, like arrests only, or convictions from a long time ago. So it's a narrow policy for just certain types of records, but it is reflective of this broader zeitgeist that legislators at least seem interested in expungement.

 

Sarah [00:37:51] And being able to leverage that as a solution came up a lot with the legalization of marijuana: how do we correct the racist history of these policies? We'll offer these big blanket expungements. And it's a way, I think, for the state to kind of offer forgiveness or to recognize what they might think of as rehabilitation. But, you know, an expungement only targets one version of your record; it's a court record. And then, in a state that has kind of got their stuff together, they will notify the other agencies, the state police, every jail where that person may have been held, things like that, to have them update their data.

 

Sarah [00:38:30] But it really is still a very manual process to leverage expungement at the state level, let alone at the private-sector level. In New Jersey, I had people in the study who got an expungement and then had to serve it to all these entities themselves. They had to serve it to the state prison, serve it to all the jails, and then they got this piece of paper from legal aid listing 200 data brokers they were supposed to serve with the expungement order, and there was no mailing address. And these companies have no interest and no liability, because a lot of them are not in the business of furnishing background checks in the legal sense. They're just aggregating data. And so it was really hard to see people feel like the expungement was really successful, get the expungement order, and then have no way of actually leveraging it. You know, there are good stories too. One person drove for a rideshare app, which was doing continuous background checks, and he had been driving for them for years.

 

Sarah [00:39:29] One of the new background checks uncovered something from eight years ago; he tried to open the app one day to work, and he was kicked out. Over time he was able to get an expungement, he brought the expungement order to the office of the rideshare company, and they allowed him to drive again. So I do think there's certainly room where this works, but I think that the hope for clean slate is really, really limited by the data environment.

 

David [00:39:53] So is the only real solution for government agencies going forward to just not release this information? Or is there any legislative or corporate fix, a Google fix, that could give people some retroactive remedy?

 

Sarah [00:40:11] Yeah, I love that question, because the more I studied this, the less I was convinced that a legislative solution was the best solution, just because of the day-to-day practical realities people deal with. I think it's an algorithm problem. I think it's a Google problem. You know, most people who dealt with digital punishment were worried about their mugshot showing up in their Google image search results. I mean, that was by far the most distressing thing. And people said to me, I don't care if it's on the court website; the court website is hard to navigate anyway. And it's true, I have this conviction, fine; I was charged with this, fine. But can an expungement get my picture down? That's really all people wanted. They want to be able to just participate in society. They don't want to be embarrassed when they meet a new person, or when they volunteer at their kids' school or are involved in their church. Churches have come up a lot in this research.

 

Sarah [00:41:04] And, you know, many, many, many people have asked Google to change their search results after an expungement, or after many years have passed since their arrest or conviction. And Google's response is, it's not us, you have to contact the website; we're a platform, we have no control over this. And this idea that they don't edit content is just false. I mean, if there's a copyright problem, if there's revenge porn, if there are, you know, payday loans, there's all sorts of stuff that Google has sifted out of search results. And I think for most people, that would be the most helpful thing, at least as a starting point, just so that they could have some ability to use Google. Facebook is also a big source. A lot of police departments post mug shots because they think it's funny, or they believe it's a public safety measure, and people still use Facebook to talk to their family. They just don't want their mug shot on there, you know, years later.

 

Sarah [00:42:01] And I think that's a valid request. But I also think that there are legislators who are understanding, especially through the lens that the data release as it stands is a problem, and who are thinking about ways to centralize, or at least set a state policy for, what types of information the police should be producing. Do I need the home address of every person who was arrested? Is it that important to know the eye color of someone who's in jail right now? And so I think some of those conversations are starting.

 

David [00:42:33] Yeah, you mentioned a couple of times the use of this information as a kind of oversight tool like citizens maybe want to know if the police are doing their jobs well or if they're abusing their power. Police want to demonstrate to the public that they are doing their jobs well, that they aren't abusing their power. Is there kind of a compromise that permits those healthy and necessary public functions without compromising people's privacy?

 

Sarah [00:43:02] Yeah, honestly, I think we've kind of fallen into this trap where we've told ourselves that the very haphazard release of things like mug shots in some counties, or court dockets in other counties, that this very messy, haphazard data dump is in any way a systematic release of comprehensive data that could tell us anything about how the system operates. So I think we've been distracted by the piecemeal data, and it's been at the expense of demanding good information from state agencies, because what we don't know far outweighs what we know about the system. The emphasis has really been on giving the public information about the people who are arrested, detained, or charged with crimes.

 

Sarah [00:43:43] So, like, I don't know anything about systematic police misconduct, judicial and prosecutorial discretion, plea bargaining, parole board decisions, prison conditions. All of that is in a black box. And yet I can go on a website and tell you the home address of everyone who was arrested in Houston, Texas, last night. And so there are very smart legal scholars who talk about the reverse sunshine effect in the digitization of public records law in general: we get lots and lots of information about one another, but we get less and less about the agencies that these laws were designed to shine some light on in the first place.

 

David [00:44:23] You mentioned Facebook a few minutes ago. I think there's also a different sort of digital punishment that people have anxiety about, one with a different political balance. People are worried that something they said on Facebook or Twitter or wherever will come back to haunt them a few years later, maybe something that wasn't particularly thoughtful or sensitive, something they only later had pointed out to them was a really offensive thing to say. And in decades past, there was a practical obscurity to those things: you might have said them to a friend and not thought much about it, but it wasn't something that was going to be recorded for all time and come back to keep you from being admitted to college or getting a job or whatever. Is there a kind of confluence here of interests with different political backgrounds? Because I think they share a lot: wanting to allow people to get their lives back, to move forward after a mistake and not have the rest of their life be dominated by this issue, not having it always be the first thing that comes up in their Google search results. Is it just a matter of, we all need to learn as a society, now that we have all this information about so many people, that we're just going to have to get over some things and allow people some grace?

 

Sarah [00:45:47] Yeah, I sort of finished the book on that same question, and I certainly got feedback that I was a bit naive in thinking that people would start to forgive because we can't forget, right? And of course, the book came out right as this notion of cancel culture was skyrocketing into our daily conversation. But, you know, I think that we can look to social science for some answers.

 

Sarah [00:46:14] I think that contact theory shows us that when people have exposure to being shamed or stigmatized, either through their own experience or the experience of someone close to them, they change their minds about it. I think that happened with the criminal legal system in 2020. A lot of people learned about the problems in the system; they didn't know how bad it was, or they had avoided having to learn about it. And they've become less interested in the mug shot, or in believing that the mug shot means this person is bad. I think employers are having an interesting time trying to reconcile their commitments to anti-racist policies or social justice on one hand, and then using police records on the other hand to make hiring decisions. I think there is more complexity, at least in the criminal record context, that's been coming out. But at the same time, public shaming is just this highly valued, very clickable thing.

 

Sarah [00:47:10] There's a big industry invested in public shaming, because we consume that content on the internet, and the speed at which it happens is really, really quick. So public shaming is at an all-time high, and because it's at an all-time high, many more people are experiencing it or watching someone close to them experience it. So maybe those forces will interact, and we'll get a new context in the near future that has a bit more humility or kindness to it. But I won't put a timeline on that.

 

David [00:47:40] Hopefully, in our lifetimes, you know.

 

Sarah [00:47:41] Yeah.

 

David [00:47:42] So what are you working on now?

 

Sarah [00:47:45] Still trying to understand clean slate. I have a great team. We're looking at expungement and clean slate policy in California, Utah, Pennsylvania, and New Jersey.

 

Sarah [00:47:53] One question I'm asking is whether the construction of clean slate law is going to reproduce inequalities. Jen and I have talked about this and written about this together. And, you know, it's a very new policy, so we'll see. I think there's a lot of similar activity around clean slate the way that we saw around ban the box. But I'm also doing mixed-methods work there. I'm interviewing people, and I'm really trying to figure out, you know, I think expungement can be very, very helpful for a specific set of reasons, and we should identify those reasons. I'm also doing new work on sex offender registries, which are really the most extreme form of digital punishment. That brings a whole new set of questions around the culture of punishment, about data commodities and criminal records, and just, you know, shaming and shunning people from society.

 

Sarah [00:48:40] There are not a lot of people in my field who study registries, so I'm kind of asking the bigger question of whether there's even a stigma around studying it. So that's sort of what I'm thinking about these days.

 

David [00:48:51] Great. Well, your work touches on so many other fields and is valuable to researchers in those fields, for sure. What are the questions that you wish people in other fields would work on more and give you answers to?

 

Sarah [00:49:05] I would love to see an economist tell me whether or not things like publishing mug shots, or police partnerships with the Amazon Ring doorbell, lead to more or less crime. And to do it in a way that isn't measuring crime as just more police reports, or more reporting of crime, because there's just very messy causal stuff going on there, and someone who uses my methods isn't able to identify those relationships. I'm not really trying to, but I think they're important. And a lot of people ask me, in the long term, is this a good thing or a bad thing when it comes to crime prevention? So I'd be very curious to see some answers to that.

 

David [00:49:47] Hey, Sarah, thank you so much. Sarah Lageson's book is "Digital Punishment." It's been such a pleasure to talk to you.

 

Sarah [00:49:54] Thank you so much.

 

David [00:50:01] You can find links to all the research we discuss on the show on our website, probablecausation.com. You can also subscribe to the show there, or wherever you get your podcasts, to make sure you don't miss a single episode. Big thanks to Emergent Ventures for supporting the show, and thanks also to our Patreon subscribers and other contributors. Probable Causation is produced by Doleac Initiatives, a 501(c)(3) nonprofit, so all contributions are tax deductible. We're so grateful for your support. Our sound engineer is Jon Keur, with production assistance from Nefertari Elshiekh. Our music is by Werner, and our logo was designed by Carrie Throckmorton. Thanks for listening, and I'll talk to you soon.