EPISODE 1888 [INTRODUCTION] [0:00:00] Announcer: Surveillance technology is advancing faster than the laws meant to govern it. Across the United States, police departments are deploying automated license plate readers, facial recognition tools, and predictive systems that quietly log the daily movements of millions of people. These tools promise efficiency and safety, but critics argue that they represent a form of warrantless mass surveillance and raise deep constitutional questions about privacy, accountability, and the limits of government power in the digital age. Michael Soyfer is an attorney at the Institute for Justice, a nonprofit public interest law firm focused on defending individual rights. His work centers on the Fourth Amendment and the growing use of surveillance technologies by local governments. Michael joins the show with Kevin Ball to discuss the rise of Flock Safety cameras, the Institute for Justice's lawsuit against the city of Norfolk, Virginia, how decades-old legal precedent has struggled to keep up with modern technology, and what citizens, technologists, and policymakers can do to protect privacy in an era of pervasive data collection. Kevin Ball, or Kball, is the Vice President of Engineering at Mento and an independent coach for engineers and engineering leaders. He co-founded and served as CTO for two companies, founded the San Diego JavaScript Meetup, and organizes the AI in Action discussion group through Latent.Space. Check out the show notes to follow Kball on Twitter or LinkedIn, or visit his website, kball.llc. [INTERVIEW] [0:01:44] KB: Michael, welcome to the show. [0:01:46] MS: Thank you. Thank you. It's great to be here. [0:01:48] KB: Yeah, I'm really excited for this conversation. Let's start with a little bit of introduction about you and the Institute for Justice. [0:01:55] MS: My name is Michael Soyfer. I'm an attorney at the Institute for Justice. I've been here since early 2024.
I primarily work on issues related to the Fourth Amendment, especially surveillance of people's movements in public spaces, and kind of the growing government surveillance state across the country. The Institute for Justice is a national nonprofit public interest law firm. We litigate cases in a bunch of areas. The one that's kind of closest to my heart is our project on the Fourth Amendment. [0:02:29] KB: Yeah. And for those listening to a software podcast and wondering what we're talking about here: I think some of these recurring themes around how technology interacts with our rights and the legal system are something I'm very excited to dive into with someone who understands the legal half of that. Let's maybe look at this particular Fourth Amendment case and Flock cameras in particular. Can you kind of give us the high-level overview for folks who haven't been paying attention? What even is this situation, and how is the surveillance state growing with it? [0:03:01] MS: I'll talk about our case in Norfolk, Virginia. First, we'd noticed that Norfolk had installed a network of 172 Flock automatic license plate readers. They're cameras that, when you pass them, even at high speeds, take a picture. And then from that picture, they read certain features of the car, the most notable being the license plate, which is a really easy way to track people and connect their movements at the same location over time. But they also gather other vehicle features. Flock refers to this as a "vehicle fingerprint." That's kind of a marketing term for the ability of its cameras to take down the make of a car, the type of car, the color, and some other distinctive features of the car. It will be able to tell if the car has a bumper sticker, if it has a roof rack, things like that. And that will all go in a police database where police can search for the license plate number, either full or partial.
They can also search based on those vehicle features and can see every time in the past, usually the past 30 days, where that car was seen and when. And that lets you kind of build this rich picture of where people are going within a city. And in Norfolk, we saw the police chief kind of briefing the city council and telling them it would be difficult to drive anywhere of any distance without running into a camera somewhere. So, we thought that was just bonkers. It was extremely disturbing that the police were gathering this sort of information about people's long-term movements. And at the time we initially sued, there wasn't really state law regulating the use of these systems. So the Norfolk Police Department had just adopted a policy that said the system can only be used for a law enforcement purpose. But what that meant was up to every individual officer. And so we partnered with some local Norfolk residents who found that level of surveillance really disturbing. They didn't want the police logging their movements for a month at a time. And we sued in federal trial court in the Eastern District of Virginia and claimed that that ongoing surveillance violated the Fourth Amendment. [0:05:27] KB: So, let's maybe dig into some of the details of how this works. Because I think for folks who work particularly in the web or things like that, they're used to hearing about, "Oh, there's tracking. I can tell where everybody's going on the web if I need to, if a particular company owns that. Or GPS. My phone is tracking everywhere that I'm going and all these different things." What makes the current issue with the license plate readers different than some of these other areas that folks are maybe more familiar with? [0:05:56] MS: The thing that's different is the government is doing it and doing it at a population level.
The government is and always will be more powerful than any of the companies that are maybe logging your location here and there through an app, or even your cell phone company that's collecting information about your movements for its own business purposes. The government is just different. It's collecting information to keep tabs on us in a way that just wasn't possible in the past. And what we've seen is, as it's become more difficult for the government to access private repositories of this type of information, they've tried to develop ways to just gather it on their own. For instance, the government used to be able to go to many private companies that gather location information. And with a simple subpoena or just a request sometimes, get location information from them. The Supreme Court really revolutionized this area of the law in 2018 when it decided a case called Carpenter versus United States. In Carpenter, the Supreme Court held that the government needed a warrant to access cell site location information. That's the information that our cell phone companies collect about where and when we connect to towers. And it will sort of tell you just which tower's coverage zone a person is in at a given point in time. And the Supreme Court recognized that people have a reasonable expectation of privacy, which is kind of the key Fourth Amendment question in that data because it's collecting their movements in a way that's impossible for any given person to do unless that person's stalking you, for instance. Those forms of information have become more difficult for the government to access, especially without a warrant. And we've seen governments in recent years trying to be able to harvest this population-level surveillance information themselves, not having to go through a middleman. 
For instance, and this is one of the reasons that we decided to file this case in Norfolk, there was a program in Baltimore where the city was flying spy planes 12 hours a day during daylight hours and taking second-by-second pictures of the entire city. The idea was that if a crime had occurred, you knew where it occurred. You could kind of follow people as little pixels on a map to see if they went to a residence, if they went somewhere else, all of that sort of stuff. And the US Court of Appeals that covers Maryland, Virginia, North and South Carolina, and West Virginia held that that was unconstitutional. It violated the Fourth Amendment to gather that type of long-term location information about everyone in a city. And we think that many of these places that are using license plate readers are just trying to accomplish the same thing from the ground that Baltimore was from the air, creating this repository of people's long-term movements that the police can use for really whatever reason they want in many places with very little regulation. [0:09:17] KB: I think that raises one of the key questions here, which is we see all over the place technology will get out ahead of regulation, and then things will catch up in different ways. And you raised the example of cell phone tracking data. Technology got out ahead. We had the ability - the cell companies have the ability to know where somebody, or at least their phone, is traveling through all of this time. Police groups, governmental groups were leveraging that until the case got to the Supreme Court, and they said, "Hey, no, you can't actually do that." For this audience, folks may not be familiar, what is the process of getting that legal trail and establishing what folks can and can't do? [0:09:56] MS: Sure. So, there are a few different forms of regulation that can apply to police use of technology. Obviously, there are department policies and directives, things like that. 
There's the judicial process, where constitutional law comes into play. And there's the legislative process. All of those have really trailed the rollout of this technology, I think. I'll just start with talking about the judicial process. The Supreme Court is very behind in bringing the Fourth Amendment into the information age. I mean, it first considered cell site location information in 2018, in a criminal case that had been filed around, I think, 2010, 2011. These things take a long time to work their way through the courts. And the Supreme Court generally just hasn't been taking many Fourth Amendment cases recently. The process of kind of updating the Fourth Amendment to take into account all of these new technologies has been extremely slow. And so you see law enforcement and Flock and other vendors relying on kind of '80s-era Supreme Court case law that considers really crude location tracking technologies. The central case in this area is about police putting a beeper in a vat of a drug precursor to try to follow a suspect. And the beeper - they had to basically stay within a certain distance of the beeper to continue getting signals from it. And that's the case that many police departments, government entities, Flock, and others point to and say, "Well, you have no reasonable expectation of privacy in your public movements. The Fourth Amendment doesn't apply. We can gather whatever information we want." It's very slow kind of going through the judicial process. We ourselves are trying to work our way through that. And this technology rolled out a lot faster than courts or legislators could regulate it. Flock has a product that's much easier to use, much cheaper than previous iterations of license plate readers. They have very slick marketing. They rolled their product out really quickly, installed it really quickly, got a lot of users really quickly. Following, frankly, I think, a well-worn model of tech startups scaling up.
And I don't think the model of like move fast and break things works well when the thing you're breaking is our constitutional order. Kind of overnight, we've seen these systems proliferate across the country in a way that was unimaginable even just a few years ago. And so courts and legislators have been very, very slow to regulate it. Virginia, only this year, enacted a license plate reader law, and it reduced somewhat the amount of data that the city of Norfolk and other cities in Virginia were collecting. Most of them defaulted to Flock's default of 30 days. And Virginia law now limits data collection to 21 days. It also limits sharing to other Virginia state law enforcement entities. And it limits when police can search this database, but it doesn't require a warrant. And frankly, I think in practice, the protections are just extremely weak and do very little to protect people's Fourth Amendment rights and privacy. [0:13:40] KB: Let's maybe dig in a little bit. Because I think in the tech industry, we're often very familiar with, "Hey, this regulation is outdated. And maybe it's not actually good anymore." Thinking about - and I'm sure Flock is making a case. They're saying, "Hey, this is going to help you keep people more secure, or solve crimes faster, or something like that." How do we as a society kind of make these decisions? What makes, for example, the Fourth Amendment in this case so important around this? How do we think about these types of decisions? Because there are cases where the new technology enables something that was never thought about before, maybe was not even within people's considerations, and that ends up being a good thing. [0:14:23] MS: Sure. Sometimes my opposing counsel in Norfolk accuses me of wanting to return to 18th-century policing. And I absolutely do not want to do that. But when we do think historically back to the founding, policing really wasn't proactive.
Most police officers, constables, worked part-time and generally weren't very well regarded. And a lot of the investigation of crime was done privately. The police would generally get involved only when private parties would go to a judge or a justice of the peace and get a warrant that kind of forced the police to do something. Physical evidence wasn't very important; often, it was just witness testimony, that sort of thing. And no one is arguing that we should return to that era, especially in a modern, complex society like the one that we have. At the same time, the Fourth Amendment balances two interests. It balances the people's interest in being secure in their persons, houses, papers, and effects with effective law enforcement. And it lets law enforcement do a lot of things, even really, really invasive things, provided that they get a warrant. They have to go to a neutral judge. And that neutral judge stands as kind of a check between really zealous law enforcement and a citizen's right to security and privacy. If they do that, they can break down doors. They can rip up your walls to look for things inside of them. They can search your computer, your file cabinets, whatever. And what the Supreme Court has said is that the Fourth Amendment is meant to prevent a "too permeating police surveillance." It's meant to prevent us from living in a society where police can kind of break down doors whenever they want. And no one can be secure in their persons, in their houses, anywhere because there's always the possibility that police are going to come bursting in, that they're going to be tapping your phone lines, that they're going to be tracking you from one location to another. I think it's important to reinforce those concerns and try to preserve a balance that's similar to the level of security and privacy people had at the founding era.
The kind of underlying idea is that for law-abiding people, where police have no reason to suspect you of any wrongdoing, of committing any crime, you should be left alone to go about your day without having your movements tracked in a huge police database and having this police dossier built up about you. [0:17:13] KB: Yeah, that makes sense. Thinking about some of the analogies you're making there then, is the eventual state we'd want to land in one with none of this surveillance, or one where it exists but you can only access it if you have a warrant? How are you all thinking about the desired state? [0:17:30] MS: I think generally, if police have no reason to suspect a person of a crime and they don't have a warrant, then they should have to delete data pretty promptly. I would say by the 24-hour mark. In New Hampshire, for instance, license plate reader data has to be deleted within 3 minutes unless there's a reason to retain it, basically. One of the kind of core uses of license plate readers has been around for a while, because police cars have had license plate readers on them for quite a long time now. And those license plate readers generally, all that they do is alert if they scan a license plate that's stolen, or that's registered to someone who has an open warrant out for their arrest, or there's like a missing person alert, things like that. Those are called hot list alerts. And that's really just sort of enhancing what the police can see, right? Because a police officer can always go, enter that license plate number, and derive whatever information is in their database. And so that's really just helping the police officer see that there's a stolen car there. But those weren't tracking people across days and weeks. And that's really where we draw the line, is police can always see if someone is at a place at a point in time. It's really when you start following them day after day after day.
And you can pick up on their habits, patterns, routines, that sort of thing, that it crosses that threshold. We don't have a problem with these systems sort of flagging stolen cars, flagging missing persons, that sort of thing. And we don't have a problem if police have a warrant to track someone to do that and to collect that information across days and weeks, as authorized by a warrant. But for law-abiding people who are just going about their days, we don't think the police should be hanging on to that information. [0:19:29] KB: Yeah, this sounds very analogous in a lot of ways to the ways that a lot of tech companies have to deal with PII, personal information of different sorts, right? It flows through. It's retained for some time period. Usually driven by some sort of regulation, whether it's like an audit period that you have to keep it available, or maybe it's a contract you have with the customer, what have you. And then, unless there's a flag that somebody says, "Okay, I need you to investigate this," it's automatically deleted and rolled off, and that sort of thing. Now, I know in the EU, there's a GDPR that regulates very, very broadly. And I know there's a lot of hate for it, actually, in the tech industry because it is a very blunt hammer. It is a very blunt tool. But it would seem to head off issues like this because this is personal data in some form. And so you would have to get consent. You'd have to explain how it's being used. All these different layers. So, are there any similarly proactive measures that we could take in the US to kind of get ahead of this never-ending race of technology? [0:20:33] MS: I think we could. The GDPR doesn't apply to law enforcement activity. Of course, the license plate readers used by law enforcement wouldn't be covered. 
But that just goes to the extensive regulation there is for private companies' collection of data, at least in some places, in Europe, in California, but very, very minimal regulation when the government is collecting kind of similar information for its own means. And often, frankly, not guarding that information even as well as tech companies might. Because what you see in place after place is that the data governments are collecting about people, about their movements, among other things, is just pretty freely available to police officers. And there often aren't great protections for it. Whether it's within the department in terms of policy, whether it's a warrant requirement, whether it's a data minimization requirement, whether it's access restrictions, that sort of thing. You just don't see those. And oftentimes, even the kind of technological protections for these data are pretty minimal. Senator Ron Wyden sent a letter to the FTC just the other day about Flock not enforcing multifactor authentication for its users. They make it available, but they don't require police departments to actually use it. [0:22:05] KB: Yeah, you don't have to dislike police use of this even necessarily to be concerned about that, right? It takes a single bad actor or a single naive policeman falling for a phishing attack or what have you. And all of a sudden, anyone's historical movement data is available. [0:22:23] MS: Yeah, absolutely. I mean, what we've seen is police departments where officers will share passwords with federal officers, for instance, and things like that. And that's possible if you don't have multifactor authentication. [0:22:38] KB: Yeah. You've talked about how slow it is to get something to the Supreme Court. And you mentioned the example of 8 years to get this view into it for cell tracking data and GPS. What other mechanisms are being pursued? Are there things that different state governments are doing around this?
What are all the different ways that this is being looked at? [0:23:01] MS: Sure. Now, many state governments are coming around, realizing that this is a huge issue, that their constituents really want some guardrails on police use of these databases, and they've started acting. I mentioned that in Virginia, for instance, there was legislation passed earlier this year. Many other states are looking at similar legislation. Many city councils have started to put the brakes on these programs. What we often see is police departments sort of acting like an unaccountable fourth branch of government and going out and just buying massive surveillance systems and installing them without any real oversight by the city council, because many of these systems have been purchased with outside grant money. In Norfolk, for instance, federal relief funds were used to purchase the vast majority of the cameras. So, that didn't have to go through the city council. There really isn't a democratic check on this sort of thing. We're seeing right now a fight play out in Denver over whether they'll continue to have license plate readers, where there seems to be a disagreement between the city council and the mayor, and they're sort of fighting about whether to keep them up. At IJ, we're approaching this from multiple different angles. We've talked to lots of different stakeholders. We're calling it the Plate Privacy Project, PPP. Everyone loves alliteration. We have a model bill, which is called the PEEPS Act. I won't remember off the top of my head what that's an acronym for, but it was very important that it'd be PEEPS. We've engaged with legislators. We've engaged with activists. And there are more routes than just the judicial angle, but we're obviously pursuing that, too. In the Norfolk case, we defeated a motion to dismiss earlier this year, meaning our case could proceed to discovery.
We got through discovery at the beginning of September, and both parties have filed motions for summary judgment, which essentially ask the court to resolve the case without a trial. And depending on how the court resolves those, we may have a trial in February of next year. That does seem like a long time considering we started last October. But this court is known around the country as the Rocket Docket. And so I can tell you that's lightning fast for federal litigation. We're certainly trying to push this case along. [0:25:38] KB: Well, and this is the thing I just keep grappling with, right? Because timelines in the tech industry are bananas. And we're talking right now about Flock cameras, but I'm also thinking about things like facial recognition technologies, different types of AI agents correlating data and doing different things. Are there any mechanisms that move faster than the legal system on this? How do we get ahead of this? [0:26:01] MS: It would be with legislation, and through legislation that anticipates future technologies and is generally applicable. The judicial process, generally, should result in rules of general applicability. But courts have been pretty slow and have sort of considered each technology on its own. We think the principles from the case I mentioned about cell site location information and the case I mentioned about aerial surveillance generalize to license plate readers. Obviously, the city of Norfolk and Flock disagree with that. They're litigating very vigorously on the other side. And the outcome is, to some extent, uncertain. But those judicial opinions should apply more broadly. The problem is that none of that will move as fast as the tech industry does. And like you pointed out, there are other technologies coming into use.
I mean, not just facial recognition, but Flock has video cameras, apart from its license plate readers, that will look for certain features like a green hat or red shirt and try to find people across many different cameras that way. It's rolling out technologies like gunshot detectors that listen for screams and that sort of thing. We've seen cities install extensive systems of facial recognition cameras from Detroit, to Philadelphia, to New Orleans. And various other types of surveillance of people's public movements are kind of becoming a reality. I think Flock is really kind of the canary in the coal mine. That's not quite what that phrase means, but I can't resist a bird pun when it comes to Flock. License plate readers were a really kind of accessible form of surveillance, I guess, where it's pretty easy to recognize the characters on a license plate and to use that as a proxy to track someone across times and locations, whereas facial recognition and these other technologies have been much more crude. But what we're seeing is just increasing government surveillance at every level of government, and we're kind of entering the age of big data government surveillance. The question is whether our legal system and our legislatures are going to anticipate that and react to it and put guardrails on it, or just adhere to these really, really outdated concepts that are based on much more archaic technologies that collected much less data. [0:28:45] KB: A thing I'd love to then explore with you, looking at the legislature: if you look at the types of professions that lead into becoming a legislator, things like lawyers, or policy analysts, or military, or sometimes teachers, you very rarely get a technologist in there, for multiple reasons. I know I spent 3 years on a city panel, and it bored me to tears because I'm used to things moving quickly and being results-oriented. And a lot of government is people who like to talk talking to each other, which we need. No judgment on that.
But that is not me. And I think a lot of technologists feel like they wouldn't want to be there. How do we as an industry, who maybe are able to see a little bit more where things are going, how do people in this industry influence that development of proactive legislation or other things that might cut this off? Because, I mean, I look at this and I'm terrified, right? And people in this industry often are familiar with China. And you go there, and you are tracked everywhere. And everything you do is tracked. And personally, that, once again, sounds terrifying. I wouldn't want anything to do with that. So how do we start engaging as an industry in this process? [0:29:57] MS: That's a great question. I don't know that I have great answers. Government is frustrating sometimes. Government moves slowly. Here at IJ, we sue the government a whole lot. [0:30:08] KB: So, that's one thing you can do. We can't do that. But one thing you're doing is just suing them. Okay. [0:30:12] MS: Well, frankly, if you want to be a plaintiff in a public interest lawsuit, you can reach out to IJ or you can reach out to other organizations that certainly would be happy to represent you. And I do think the perspectives of people in the tech industry as parties, as plaintiffs in public interest litigation, would really be valued. Because sometimes people assume that tech likes tech and everyone kind of favors the rollout of new tech that's tracking you everywhere you go. And people in the tech industry often don't actually want that. [0:30:46] KB: I will say the people I know who are the most paranoid around things like web tracking, search engines, what's available online are people who have worked in search engines or other things. They know, and do not like it. But okay, I want to go on that a little bit more because you say that, okay, you could be - actually, what are the words you use? Be the - [0:31:06] MS: A plaintiff in public interest litigation.
[0:31:07] KB: Okay, a plaintiff in public interest litigation. I have no idea what that actually entails. I know high-level that's like, "Okay, the case is on my behalf in some way." But what does that mean? What would somebody actually go through? What becomes public about them if they choose to do that? [0:31:22] MS: Yeah. The crazy thing is that there are tradeoffs between maintaining your privacy and bringing these cases publicly. And I'm hugely grateful to our Norfolk clients, who are very private people but have put themselves out there for public examination. Now, we've been able to protect their privacy to a great extent. The defendants in our case actually asked for all sorts of invasive discovery about them. They wanted all of their social media, all of their cell site location information; they wanted them actually to keep a log of everywhere they went and everyone they saw for a period of 5 months. And the court said no to all of that. The court thought that that was ridiculous and irrelevant. So thankfully, we were able to preserve that. I mean, you do come in for pretty invasive discovery as a plaintiff. At the same time, you get to have your views known. You get to make it clear that you find this form of surveillance disturbing, unsettling. But I'll say, you don't have to become a party to a case. You don't have to sue a city or a police department, whatever it is, to make your views known. There are lots of other ways to influence public policy. Even in litigation, you can file amicus briefs. [0:32:39] KB: What is that? For all of us who are not legal folks. [0:32:41] MS: Sure. An amicus curiae is a friend of the court. And you can file a brief that sort of gives the court your distinctive point of view. And in many cases that involve technology, frankly, companies and people who work in the industry will file briefs with the court to kind of give it a better perspective on the technology.
In these cases that involve privacy, we'll oftentimes hear from privacy scholars. And those things are very helpful to courts, frankly, which might not have a nuanced understanding of privacy research or of technology, and where the parties might not have that level of understanding in many cases. [0:33:25] KB: Well, and that gets to, I think, one of my core frustrations in general with regulation around technology. If you read any of it, you're like, this was clearly written by people who have no clue about how this actually works. Being able to share knowledge of like, "Hey, I've been deep diving in this industry for 20 years, or whatever it is. And let me actually try to explain it in that way," seems valuable. How does that - do you have like a request out? In our world, there's like a request for comments. Is there a request for amicus briefs of like, "Oh, we're in this case. We need this set of expert opinions?" Or is it something where individuals go and find cases that are ongoing? Or how does that work? [0:34:04] MS: It kind of works both ways. I'll say, in litigation, just generally speaking, the parties will often reach out to people that they want to weigh in. For any given set of cases, it can be a pretty well-known set of characters, or it can be looking for other people. Sometimes people reach out in cases they're interested in and ask the parties like, "Hey, can I file a brief that expresses this view or provides this information or that sort of thing?" And obviously, I should just put out there that the litigation process isn't the only way to influence regulation of these things. You can also talk to your state legislators. You can also talk to local leaders, city council members. You can talk to the media. You can write op-eds. You can do all sorts of things that influence public policy without having to enter the fray in a lawsuit. [0:34:58] KB: For sure. And I think folks tend to underestimate how impactful they can be locally.
Having dabbled a little bit in local stuff, if you show up to a city council meeting - I live in Mountain View. There are 100,000 people or so who live here. I'll show up, and I'm like one of 20 people. If I give comment, that is substantial. People notice. Sometimes there are contentious things, and there are many more people. But your ability to impact a local legislature is large. [0:35:22] MS: Yeah. I mean, oftentimes, people don't even know what's going on with their local legislators or even their state legislators. And it can be difficult for people to figure out how to get their comments in. You may not be able to appear by Zoom. It might be during hours when you have to be working. It's very hard. And so when you're interested in an issue, just reaching out can often have an outsized impact, especially at the local and state level, where they may not at times be hearing as much from people. And oftentimes, when it comes to policing issues and surveillance issues, on the one side, you have police who feel very strongly about this. You might have a private company that has skin in the game and is lobbying really hard. And then on the other side of the balance, you might have no one, because it's kind of an intangible right. And people might not realize how much it's slipping, may not realize that they need to show up, and may not realize how tilted the balance in the legislative process is towards police and private industry. [0:36:36] KB: If someone wanted to know, "What is my city's take on this? Or what is my state doing here?" I'll use myself as an example. I'm in Mountain View, California. I want to know what my city council is saying about Flock cameras, or license plate readers, or what have you. Where would I go? [0:36:52] MS: If you Google Plate Privacy Project, IJ has a website that gathers a lot of data about Flock cameras in various localities. We've gathered a lot of information from places that have Flock transparency portals.
We also link to a website called DeFlock. That's deflock.me, which shows locations of Flock cameras. And those are all great ways to find out whether you have the cameras in your community and where they are. It can be hard to figure out if there is action at any given point in time. The best you can really do is Google it, search the news, what have you. [0:37:37] KB: Oh, I'm feeling good now. I just looked, and my city is a blank spot surrounded by bubbles. All right, good job, Mountain View City Council, or something. [0:37:47] MS: Yeah. Yeah, Mountain View, I guess. Unless they're just not mapped there. But maybe Mountain View cares about your privacy a lot more. Ironically, as we've seen people - well, I'll put it this way. Oftentimes, we hear from Flock that people are afraid of what they don't understand, and if they understand the technology more, they'll be less worried about it. And what I've seen is that just the opposite is true. Oftentimes, these systems are rolled out, like I said, without city council oversight, without much fanfare. The cameras just go up, and they can be hard to notice. They're pretty discreet. Many people might miss them. But as people have become aware of them, there has been huge backlash in many places. And many communities have removed their cameras. And it's always surprising to me seeing some small town way out there that had a car break-in six years ago installing five, six, seven Flock cameras. And people rightly question if that's even necessary, and if there's sufficient offsetting benefit to give up their privacy in that way and to have their movements logged in this enormous database. [0:39:01] KB: Yeah. No, it is fascinating. And I am going to recommend this site. Because I'm looking at it as we're talking. There are a couple cameras I found somewhere in my city. They do appear to all be in intersections that actually could well be just traffic. They're bad intersections. They're doing traffic light monitoring. I don't know.
Yeah, you can dig in. Some cities, you can just see the density, and it's wild. [0:39:25] MS: Yeah, in many places, it's really crazy. In Norfolk, I mean, they have 176 cameras now. And they're concentrated along main thoroughfares at intersections. And so, if you're trying to get anywhere in the city, you will pass those cameras. You will be at an intersection where there's an array of four cameras that can tell which way you turned and which direction you're going in. We had an expert who's a professor of applied mathematics create a simulation of 15,000 realistic routes across the city and try to assess what you could tell about those routes using the cameras. And lots of routes - the majority of routes - will pass at least two cameras, such that you can connect the dots and pretty much figure out the route a person took. Some routes, you could reconstruct almost in their entirety. And there were lots of areas within Norfolk where many people's trips could be reconstructed in that way to a really disturbing extent. [0:40:33] KB: Yeah. No, this is fascinating. And I am already imagining even just showing up to your city council and asking, "What are your policies on this? What is going on with this?" Because I bet a lot of the city council members have not thought very much about it. [0:40:46] MS: Yeah. I mean, there's often very little oversight. The police department might come up with its own policy. That policy may or may not be posted somewhere that's publicly available. And the city council might not have much oversight over it. And I think it's time for people to ask hard questions about who has access. Do we need every single officer to have access? Do we need them to have access to all of the data in the database to do their jobs? What are we telling them about when they can search the database? How much guidance are you giving to an officer? Just saying, "Oh, only use it for law enforcement purposes."
That isn't really a lot of guidance and isn't a great limit on when officers can use it. How much information is it collecting? How long is it being kept? How is it being used? And then a really important question is, what oversight is there? To give you an example, in Norfolk, we found out that they didn't conduct any audits for two years. And so whenever there's a scandal involving officers searching Flock's database for impermissible reasons, Flock always points to the fact that, "Oh, well, we give all of our customers access to this audit tool that they can use. And it's really their job to oversee how their officers are using the system." And all that means is you can download a spreadsheet that gives you the reasons that your officers are searching the system. But someone actually has to look at that, and someone actually has to audit it. And what we found out in Norfolk is that no one audited it for two years. And even then, in many cities, the vast majority of the reasons that officers will give for their searches, as has been seen in many, many different news articles, are extremely vague. They're things like law enforcement, investigation, that sort of thing. Virginia now requires slightly more detailed rationales. But unless someone is really rigorously auditing the data, not just doing kind of a box-checking exercise of, "Did you put a reason? Does it pass the smell test?" there's going to be so much misuse of these databases that's just kind of flying under the radar. The example I always give is there were two police officers in the Wichita, Kansas area who were caught using Flock databases to stalk their exes. And one of them, he was a police chief actually. If you look at the reasons that he put for his searches, they all look facially legitimate. They're things like missing person, drug investigation, that sort of thing.
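The kind of genuine auditing Michael is describing - grouping searches by officer and plate rather than just checking that each search has a plausible-sounding reason attached - is straightforward to sketch. A minimal illustration in Python, where the log rows, column layout, and threshold are invented for the example and are not Flock's actual export format:

```python
from collections import Counter

# Hypothetical audit-log rows: (officer, plate searched, stated reason).
# Note that every stated reason below would pass a naive "smell test".
audit_log = [
    ("officer_a", "ABC1234", "missing person"),
    ("officer_a", "ABC1234", "drug investigation"),
    ("officer_a", "ABC1234", "missing person"),
    ("officer_a", "XYZ9876", "stolen vehicle"),
    ("officer_b", "LMN4567", "robbery follow-up"),
]

def flag_repeat_lookups(log, threshold=3):
    """Flag (officer, plate) pairs searched at least `threshold` times.

    The misuse pattern only shows up when you aggregate by officer and
    plate; reviewing each row's stated reason in isolation reveals nothing.
    """
    counts = Counter((officer, plate) for officer, plate, _reason in log)
    return {pair: n for pair, n in counts.items() if n >= threshold}

flagged = flag_repeat_lookups(audit_log)
print(flagged)  # {('officer_a', 'ABC1234'): 3}
```

The point of the sketch is that the check has to look across searches, not at them one by one: each individual row looks facially legitimate, and only the aggregation surfaces one officer repeatedly querying one plate.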
And so unless someone is really taking a hard look - spotting that this officer is doing a lot of searches, that they're all for this specific license plate, and digging into whose license plate that is - you're not going to catch that sort of misconduct. And I just don't think there is that level of oversight anywhere, frankly, over these data. [0:43:51] KB: Well, that kind of gets me to another question related to this, which is one of the things that's standing out to me through this conversation: this is likely to, at least for the foreseeable future, be a very locality-by-locality type of situation. The pace at which anything is going to be decided at a federal level is going to be slow. It is likely to be overly specific - for example, maybe it just applies to Flock cameras, but then the next generation of technology changes. If folks are looking at this at a local level and saying, "Hey, I'm willing to go and make a stink at the city council," which I cannot advocate for enough because I have done that. I have been that person. City councils, school boards, they actually kind of listen to you because they don't get yelled at by very many people, unlike Congresspeople, who are used to being yelled at. But is there a model to point to where you can say, "Oh, you know what? This is what responsible use of these tools looks like"? Because, I mean, I live in a neighborhood where there are a lot of people who are safety-concerned. They might say, "Yeah, we want responsible use. We don't want to get rid of this stuff, but we do want an audit trail. We want observation. We want something that makes sure they're being used for the right purpose." Is there a model out there, like a municipality or state, that's doing this well? That somebody could say, "Hey, you don't need to reinvent the wheel. Just go look at them"? [0:45:06] MS: I love New Hampshire, which requires deletion within three minutes, unless there's a hot-list alert.
That's probably not feasible all across the country. And there are valid and helpful uses of these license plate readers, to flag stolen cars, or, in the wake of a robbery, to be able to look up an image quickly and get a lead that way. And the point is really to balance that against the invasion of ordinary, law-abiding people's privacy. So, at IJ, on our Plate Privacy Project website - I referenced it earlier - we have a model bill called the PEEPS Act. The acronym stands for Protecting Everyone from Excessive Police Surveillance. It's primarily intended to limit the use of license plate readers, but it's a little bit broader than that. And it does cover other police surveillance technologies and would iterate over time as new technologies come online. But it has a lot of the components that we've been talking about: data minimization, warrant requirements, access restrictions, audit requirements - genuine audit requirements, having someone actually look at the data skeptically - and disciplinary requirements for officers who violate the law. [0:46:31] KB: Yeah, I'm looking at it now. Scanning this, it looks like it's largely aimed at state-level legislators as a model bill. But one could point to it if you had a city that was interested, or something like that, as well. [0:46:42] MS: Of course. Yes. [0:46:45] KB: Fascinating. Well, certainly, I now have a bee in my bonnet. I want to go and talk to some of the folks in my city about this. Are there any things we haven't talked about that you think would be important to cover before we wrap up today? [0:46:59] MS: I don't think so. But I'll just underscore that license plate readers are really just the start of a new era of big police surveillance. Police are using all sorts of tools, all sorts of data, to build massive dossiers about people just in case they need them, right?
And we've seen in recent years police buying data from data brokers, police maybe even using data from data leaks, police investing in surveillance technology. And that's just kind of inconsistent with this country's tradition of limiting police surveillance, of letting people live their lives free from the suffocating atmosphere of police surveillance if they're not doing anything wrong. And the idea that police would be hanging on to information just in case is totally inconsistent with our constitutional order. And I would just warn people, a lot of the attention right now - and rightly so - is on the misuse of license plate reader data, of other surveillance data, for oppressive immigration enforcement, or to go after women who may be going out of state to get abortions, that sort of thing. To me, those are all the symptoms, not the disease. The disease is just having this massive government surveillance apparatus. And the real way to prevent this information from being abused is not to have it in the first place, and not just to trust that the government right now will not abuse it. Because, as we see, these systems are set up with good intentions usually, but they're always susceptible to future abuse. [0:48:48] KB: Well, and to point to some of the examples you gave before, you don't have to mistrust the government writ large. You just have to believe that there can be bad actors. [0:48:57] MS: Absolutely. [0:48:58] KB: Because, as you highlighted, a lot of the constraints that are already there for tech companies don't apply to government. A lot of the best practices that certainly every tech company I've ever worked at has thought about in terms of data access and all those different pieces may not be there. If you look at a Google, or a cell phone company, or someone else that has these massive amounts of data, they also have all these different internal policies and things to prevent bad use of data. You have double opt-in. Why do you need to do this?
Who actually has access to what? Audit trails, all those other things. Even if you do think that, "Hey, police should have this type of stuff," they should have at least the same level of safeguards that we expect companies to put in place. [0:49:38] MS: Of course. And they don't have anywhere near that. And the government is terrible at ensuring police accountability. One of the other areas IJ works in, we have a project on immunity and accountability, which seeks to get rid of all of the various doctrines that protect police and other government officials from having to face the consequences of their actions in the ways that any private party would. Really, the question of who polices the police comes up. And oftentimes, it's no one. [0:50:14] KB: That actually sounds like a great close. It's a good wrap. [END]