Law enforcement's ability to track and profile political protesters has become increasingly multifaceted and technology driven. In this edition of Incognito Mode, WIRED Senior Editor, Security & Investigations, Andrew Couts and WIRED Senior Writer Lily Hay Newman discuss the technologies used by law enforcement that put citizens' privacy at risk—and how to avoid them.

Shop for products discussed in this episode of Incognito Mode:
Silent Pocket SLNT Faraday Waterproof Backpack: https://amzn.to/42ePTOg
Visit the SLNT Storefront: https://amzn.to/4cA8ub9
When you buy something through our affiliate links, we earn a commission.

Director: Efrat Kashai
Director of Photography: Brad Wickham
Editor: Matthew Colby
Host: Andrew Couts
Guest: Lily Newman
Line Producer: Joseph Buscemi
Associate Producer: Paul Gulyas
Production Manager: Peter Brunette
Production Coordinator: Rhyan Lark
Camera Operator: Mar Alfonso
Gaffer: Niklas Moller
Sound Mixer: Sean Paulsen
Production Assistant: Malaia Simms
Post Production Supervisor: Christian Olguin
Supervising Editor: Erica DeLeo
Assistant Editor: Justin Symonds
Category: Tech

Transcript
00:00Protests, almost by definition, are points of contention between citizens and their governments.
00:06Police tracking of protesters is multifaceted and includes a variety of tactics and gear
00:10that generate different data.
00:11Some surveillance is done at the protests, while other methods are used outside of it.
00:15It's just like all different ways to get at this core thing of who was there, what
00:21are they up to, what do they think about things.
00:24I think that's sort of how I break it down because so many of these technologies are
00:28unseen or not intuitive.
00:31In this episode, we'll discuss the technologies used by law enforcement that put citizens'
00:35privacy at risk.
00:36This is Incognito Mode.
00:44The movies were way ahead on this, right?
00:47Like they were depicting the yellow box that goes around the face type of thing.
00:52Now that is very real.
00:55This technology is more and more available to law enforcement.
00:58Although law enforcement has had access to facial recognition tools for about 20 years,
01:02they previously were only able to search government images such as mug shots.
01:06This changed in 2018 when many police departments started using Clearview AI, a facial recognition
01:12app that allows them to match photos from around the web.
01:14Once a photo is uploaded, the app pulls up matches found online along with links to the
01:18source of those photos.
01:20Clearview says more than 600 law enforcement agencies across the country use this software.
01:26Based on the person's facial geometry, the images are converted by the system into a
01:30formula, measuring things like eye distance.
01:32This means that law enforcement can use any image to search for a person who doesn't currently
01:36have a police record and isn't known to authorities, and potentially identify them in seconds.
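To make that concrete, here is a minimal sketch of the general principle, using the open-source face_recognition library rather than Clearview's proprietary system: each face is reduced to a numeric encoding, and identification is just a distance comparison against a gallery of known encodings. The names and file paths are hypothetical.

```python
import face_recognition

# Build a small "gallery" of known faces from labeled photos (file names are hypothetical).
gallery = {}
for name, path in [("alice", "alice.jpg"), ("bob", "bob.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)  # one 128-number vector per detected face
    if encodings:
        gallery[name] = encodings[0]

# Look for gallery members among every face detected in a crowd photo.
crowd = face_recognition.load_image_file("crowd.jpg")
names = list(gallery)
for probe in face_recognition.face_encodings(crowd):
    distances = face_recognition.face_distance([gallery[n] for n in names], probe)
    best = int(distances.argmin())
    # Smaller distance means more similar; ~0.6 is the library's usual match cutoff.
    if distances[best] < 0.6:
        print(f"possible match: {names[best]} (distance {distances[best]:.2f})")
    else:
        print("no match in gallery")
```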
01:40I wanted to ask you, since you've covered this a lot, how do you view the risk of these platforms
01:47as they proliferate?
01:49To be quite frank, it freaks me the hell out.
01:51Image recognition is just really, really good now and cheaper to deploy.
01:56And so, you know, I think it's more just kind of accepting that this is just part of life,
02:01like just commuting every day.
02:02You're probably being subjected to some of these systems in one form or another.
02:06It's not just the systems where you have face rec built in, it can be deployed after
02:09the fact.
02:10If you're in people's pictures that are posted on social media, it can get uploaded to these
02:14systems and then you can get picked out of a crowd in that way.
02:17USA!
02:18USA!
02:19USA!
02:20We saw that with, you know, the January 6th insurrection, videos that were posted to
02:23Parler and other social media platforms.
02:25News tonight, an Auburn man has been found guilty of federal charges for his actions during
02:29the January 6th insurrection.
02:31You know, the FBI took those, they saw people in the videos, they went back and kind of looked
02:35to see like, okay, here's proof you were there.
02:38Governments in 78 countries use public facial recognition systems with varying degrees of
02:42support from their citizens.
02:44Many countries use the technology without transparent regulations.
02:47In Russia, facial recognition tools have been used not only to detain people protesting
02:51the war in Ukraine, but also to identify and arrest opponents of the government before
02:55they joined any demonstrations.
02:57Reuters reported that the facial recognition systems used in Moscow are powered by Western
03:02companies, including Nvidia and Intel.
03:05Other companies such as Amazon have also launched software that allows users to build a facial
03:10recognition database using their own photos.
03:12These systems, they're everywhere and things that you might think could kind of thwart these
03:17systems, even like wearing a mask and these kinds of things.
03:19Some of the technologies can get around that.
03:21I don't know what to do with that information, to be honest.
03:24There are a lot of police here. Are you not frightened?
03:28We are, but, you know, we are together. That gives us real power.
03:33I am frightened. Of course I'm frightened. That's why I'm just covering up all my face
03:37just so that they cannot even, you know, find my ID. But me being afraid doesn't mean that
03:43I'm not going to be here today and fight for my future.
03:46I agree 100% with what you were saying about how masks and other deterrent measures aren't
03:53always effective at defeating these identification technologies, but clearly they are at least
04:01somewhat effective sometimes because, you know, in a lot of crackdowns we've seen in the last few years
04:07by multiple governments, like one thing they'll do is try to ban mask wearing in certain settings.
04:13Yeah. Are there any other things? Please tell me that you have more.
04:17Yeah. I mean, I think there are ways to minimize the data and thus minimize the risks.
04:24Just simple things like not shooting pictures and videos while you're at a protest so you're not
04:29capturing yourself and anybody else who's around you is one way to keep it out of some types of systems.
04:35Avoiding some systems is better than avoiding no systems. You are going to be subjected to this
04:40technology in one way or the other and you just kind of have to proceed as best you can and minimize
04:47your contributions to those systems as much as possible. CCTVs or security cameras have been ubiquitous
04:55for a few decades now. One could have thought 20 or 30 years ago, like, well, now everything is going to
05:02be captured on film all the time. But there are limitations still to just how much data is stored,
05:11for how long. You know, there have been a lot of high profile events around the world in recent years
05:17where there wasn't adequate security footage to really know what had happened. It's not like every step
05:23you take is recorded. Someone has to pay to run the system and store the data to identify you.
05:32In 2010, Wired reported on federal agents friending crime suspects on sites like Myspace in order to
05:38see their photos, communications, and personal relationships. More recently, police have used
05:42companies like Dataminr to more easily sift through massive amounts of data in order to glean
05:47information about how protests are organized, to identify activists, and to piece together people's
05:52connections to each other. So social media accounts, right, it's a lot of data on everyone who's using
05:59these platforms. But I kind of think of these surveillance technologies in two buckets. One would
06:07be if authorities want to find out more about a specific person, right? What has Andrew been posting
06:16about or saying and are there photos, you know, of Andrew online, things like that. But then the other
06:24one would be coming at it from the flip side, where it's like they're looking for anyone who has been talking
06:31about x thing or, you know, anyone marking their location in a certain place on a certain day.
06:38Authorities can go directly to the sites or they might want to use a service that kind of pulls a ton of
06:46data from social platforms together, you know, aggregates all of it and gets kind of lists of
06:52names. It gives the ability to like have this vibe check. Like those platforms themselves aren't
06:58inherently a surveillance tool, right? Sometimes we use them for journalism. I've used some of these
07:03services like Dataminr before and once you see just the firehose of information that you can get
07:09access to when you use it, it becomes clear just how easy it is to kind of figure out what is going on, even if
07:15it's not obvious to you and your own like curated timeline. Just the use of them has become more
07:20widespread. You wouldn't know, without doing some investigating, whether your local police department
07:25is using this or not. That creates an environment where you have to assume that that's what's happening.
07:29Steps like making your account private or setting something to expire quickly, maybe they can help but
07:37I wouldn't assume those types of settings can really truly protect data on big mainstream platforms.
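As an illustration of that second bucket, the keyword-and-location sweep, here is a toy sketch (not Dataminr's actual product or API) of how a monitoring tool can sift a firehose of public posts down to a list of accounts. All posts, usernames, keywords, and place names below are invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    user: str
    text: str
    tagged_place: Optional[str]

KEYWORDS = {"protest", "march", "rally"}
TARGET_PLACE = "city hall"

def matches(post: Post) -> bool:
    text = post.text.lower()
    keyword_hit = any(word in text for word in KEYWORDS)
    place_hit = post.tagged_place is not None and TARGET_PLACE in post.tagged_place.lower()
    return keyword_hit or place_hit

# A tiny stand-in for the firehose of public posts.
firehose = [
    Post("user_a", "Rally at noon, bring signs", "City Hall Plaza"),
    Post("user_b", "Great sandwich today", None),
    Post("user_c", "Who else is marching on Saturday?", None),
]

flagged_accounts = sorted({post.user for post in firehose if matches(post)})
print(flagged_accounts)  # ['user_a', 'user_c']
```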
07:47An example of how social media surveillance was used can be found in the MPD's surveillance of
07:51the George Floyd protests in 2020. It was found that the MPD collected data about protest events,
07:56including dates, locations, organizers, and estimated crowd sizes. The MPD shared this information with the
08:02Secret Service, National Park Service, and the Department of Defense. So I think the other huge
08:08advice is about data minimization and not posting about things that you worry about getting into other
08:19people's hands. There's a tension here with chilling speech, right? The nature of the internet is to share
08:25information, right? That's like the whole purpose of the platform. When you put stuff out there,
08:30it's hard to say like, okay, it's out there, but only for certain people and control it. Our perspective
08:35on it is probably a little bit different because we're journalists. We're kind of in the public eye
08:39in a way that some other people aren't. But I think anybody, no matter if you have one follower or a
08:44million, you should be really careful about what you post online and when you post it online. You know,
08:50if you're going to post vacation pictures, I never post them while I'm actually on vacation because then
08:55that signals to somebody like, hey, my house is empty. You can apply that to all different types of
09:00risks. And I think generally posting less is the way to go. But also some people really want to post
09:06or that's their like job or, you know, that's their, how they make money. It's just helpful to
09:12understand that the greater the volume you're posting, the more likely it is that something you didn't think of
09:18is exposing information you didn't realize is now out there.
09:22IMSI catchers, also known as cell site simulators and formerly referred to as stingrays, are devices
09:30that impersonate cell towers, causing cell phones within a certain radius to connect to them.
09:34Initially designed for military and national security purposes, this technology has emerged in routine police use. Until recently, the use of IMSI catchers was withheld from the public. The FBI has
09:44even forced state and local police agencies to sign NDAs in order to use their devices.
09:47I mean, I find IMSI catchers fascinating just in that their, their use is really secretive. Like,
09:54there was a long time that police weren't allowed to say that they had them or that they were using
09:57them. So there's just this. And no one had seen one. Right. Yeah, exactly. Can you tell us just a little bit
10:03about how that works? These are devices that, at their core, just identify that your phone was
10:11physically in a certain location, like that's the baseline thing it's trying to achieve, sometimes called
10:16an IMSI catcher because of this IMSI number that it's trying to pick up. They can work in different ways.
10:23They can work passively to just sort of sweep around and say what devices are in the area and let me try to,
10:29you know, decrypt their signal and catch that, you know, an ID number. More often they're, they work
10:35actively as like a fake cell tower, taking advantage of the way the system works that your phone is going
10:43to connect to the cell tower that's emitting the strongest signal in the area to give you the best
10:48service and then grab that ID number. Sometimes they can also potentially grab other stuff like
10:58unencrypted communications, like SMS text messages. It's important to know that one of the things that
11:05can happen when you bring a phone to an event like a protest is that the fact that you were there and
11:12potentially some other information could be sort of pulled out of the air by one of these devices.
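A toy model of the protocol weakness being described, not real radio code: the phone simply attaches to whichever tower advertises the strongest signal, and hands over its IMSI identifier when it does. The tower names, signal strengths, and IMSI below are invented.

```python
from dataclasses import dataclass

@dataclass
class Tower:
    name: str
    signal_dbm: int            # closer to zero = stronger signal
    run_by_carrier: bool

def attach(imsi: str, towers: list) -> Tower:
    # The phone can't verify who operates a tower; it just picks the loudest one.
    chosen = max(towers, key=lambda tower: tower.signal_dbm)
    print(f"{imsi} attached to {chosen.name}")
    return chosen

towers = [
    Tower("carrier_tower_1", -95, True),
    Tower("carrier_tower_2", -90, True),
    Tower("portable_fake_tower", -60, False),   # a cell-site simulator parked nearby
]

chosen = attach("310150123456789", towers)
if not chosen.run_by_carrier:
    print("IMSI captured by the fake tower")
```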
11:17Records show that IMSI catchers are used by 23 states and the District of Columbia, as well as the DEA, ICE, FBI,
11:24NSA, and DHS, along with many additional agencies. In terms of how people gauge the risk of these,
11:30I mean, for one thing, like you said, a lot of times they're looking to target one person or maybe a
11:36couple of people. And it does end up looping in a lot of people just by the nature of how it works. But
11:41it's also one that I think is expensive and complicated to deploy. And so it's probably not
11:46going to be the top concern. If I were going to a protest, I don't think it's the thing I would be
11:51so concerned about just as an average person. Another thing in that vein, you know, if this
11:57technology that we're talking about is rogue cell towers, it means that actual cell towers also have
12:03all this information, right? Like your wireless provider knows where you go. So that data exists
12:10anyway. And there are potentially other ways that, you know, authorities can get that information.
12:18Geofence warrants or reverse location warrants allow law enforcement to request location data
12:23from apps or tech companies like Google or Apple for all devices in a specific area during a set time.
12:29Authorities can then track locations, identify users and collect additional data like social media
12:33accounts. This is yet another layer among the multiple approaches to getting the same information: who
12:42was at a certain place at a certain time and what can we find out about what they were up to? A lot of
12:48it's advertising data or what's being shared all the time from your device that you probably aren't paying
12:53much attention to and is used in a much more innocuous way typically. And it's sort of
12:58slurping up all the data from this area which is constrained in a way but doesn't account for
13:06passers-by, people, you know, getting coffee at the deli next door, people just sort of coming up to a
13:13location to see what's going on. Like this is just bulk indiscriminate data. I am worried about it but
13:20maybe not specifically. Like it's in the category to me of all the reasons that I might consider leaving a
13:27device at home or putting it in a Faraday bag. It's sort of just on that list of reasons that
13:33you might want to minimize the data that your device is emitting.
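Stripped of the legal process, a geofence request boils down to a query like the sketch below: take a large table of device-location records and keep every device seen inside a bounding box during a time window. The device IDs, coordinates, and times are invented; note how the bystander gets swept up along with the protester.

```python
from datetime import datetime

records = [
    ("device_1", 38.8895, -77.0353, datetime(2020, 6, 1, 14, 5)),   # at the protest
    ("device_2", 38.8897, -77.0350, datetime(2020, 6, 1, 14, 20)),  # buying coffee next door
    ("device_3", 40.7128, -74.0060, datetime(2020, 6, 1, 14, 10)),  # different city entirely
]

LAT_MIN, LAT_MAX = 38.888, 38.891
LON_MIN, LON_MAX = -77.037, -77.034
T_START, T_END = datetime(2020, 6, 1, 13, 0), datetime(2020, 6, 1, 16, 0)

swept_up = {
    device
    for device, lat, lon, ts in records
    if LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX and T_START <= ts <= T_END
}
print(swept_up)  # {'device_1', 'device_2'} -- the bystander is included too
```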
13:41Data brokers collect and sell personal data from public sources, websites, and apps people use
13:46every day. They aggregate all this info to build detailed profiles of people and to group them into
13:50simplified categories such as high income, new moms, pet owners, impulse buyers, and more. While
13:55advertisers are usually their primary clients, police can also purchase this data. Some of the
14:00largest data broker companies include Experian, Acxiom, and Equifax. The amount of data Equifax collected
14:06came to light in 2017 when a data breach exposed 147 million people's personal data.
14:11I think it just fuels this ability to identify someone and track kind of their behavior across
14:20the web and potentially their speech. Similar to the way law enforcement can track people and surveil
14:27people through social media platforms, information from data brokers can aid investigations
14:32in two ways. They can be coming at it from a person of interest who they're trying to find out more
14:39about or authorities can be coming at it from, I want information on anyone who has had an IP
14:47address in this area, or anyone who has searched for certain keywords, you know, and been shown these types
14:55of ads. So how do data brokers collect information? The most common ways include web browsing history,
15:00everything from your Google searches, sites or apps you visit, cookies, social media activity,
15:04or even a quiz you just filled out for fun. All of that can be scraped and tracked. This data
15:08creates each person's online history map, which in turn allows brokers to build a profile on each
15:12user. The data that companies collect often include name, address, phone number and email address,
15:17date of birth, gender, marital and family status, social security number, education, profession,
15:23income level, cars and real estate you own. It also comes from public sources. This can be anything
15:27in the public domain, such as birth certificates, drivers or marriage licenses, court or bankruptcy
15:32records, DMV records, and voter registration information. It can also include commercial sources,
15:37such as your purchase history, loyalty cards, coupon use, and so forth. And finally,
15:42some websites or programs will ask for your consent to share your data. Sometimes it's anonymized in
15:47certain ways, especially when it comes to advertising data, but it's pretty trivial for law
15:52enforcement or other investigators to tie certain advertising behavior to a specific device, especially
15:59if it's collecting precise location data. And there's also data brokers that are building network
16:04profiles. So you can get not just information about yourself, but about everybody you've interacted with, whether it's on social media or actually in real life. In the United States, at least, we just lack
16:14laws that kind of regulate what these companies are able to collect. And if you have to participate in
16:20modern society, as nearly everyone does, it's almost impossible to avoid. I think in the context of
16:27protests, it's not an acute concern, I would say, but it is generally speaking, really freaky when the
16:34sky's the limit on what they could potentially use because there's just so much data. I agree with what
16:38you said, sort of low on the acute scale, but high on the existential scale.
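A simplified sketch of the aggregation step described above: records from different sources, joined on an overlapping identifier such as an email address, get merged into a single profile. Every source name and value here is invented.

```python
from collections import defaultdict

# Invented records from three "sources", each keyed on an overlapping identifier (email).
sources = {
    "voter_rolls":    [{"email": "jane@example.com", "name": "Jane Doe", "address": "12 Oak St"}],
    "retail_loyalty": [{"email": "jane@example.com", "interests": ["pet owner"]}],
    "quiz_app":       [{"email": "jane@example.com", "income_bracket": "high"}],
}

profiles = defaultdict(dict)
for source_name, rows in sources.items():
    for row in rows:
        key = row["email"]                     # the identifier used to link records together
        for field, value in row.items():
            if field != "email":
                profiles[key].setdefault(field, value)

print(profiles["jane@example.com"])
# {'name': 'Jane Doe', 'address': '12 Oak St', 'interests': ['pet owner'], 'income_bracket': 'high'}
```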
16:46One of the big surveillance technologies that probably everyone who's driven on the highway knows about
16:51is license plate readers, really just capturing what your license plate is and showing that your
16:56vehicle was at a certain place at a certain time. Similar to like your phone, your car, it's a proxy
17:03for you. Maybe you were in the car, maybe you weren't, but that's where your car went. There are three
17:08types of ALPR systems, stationary or fixed ALPR cameras, which are installed in a fixed location,
17:14like a traffic light, telephone pole, or a freeway exit ramp. The second type are mobile ALPR cameras,
17:20which are attached to police patrol cars, garbage trucks, and other vehicles, and allow them to
17:24capture data from license plates as they drive around the city. They can also assist law enforcement
17:29in gridding, which is when police officers drive up and down a neighborhood collecting license plates
17:33of all parked cars. There are also private vendors like Vigilant Solutions, which collect license plate
17:38data and sell that back to police. The third type are ALPR trailers, which are trailers police can tow to
17:44a particular area and leave for extended periods of time. It's been reported that the DEA has disguised
17:49ALPR trailers as speed enforcement vehicles and placed them along the US-Mexico border.
17:54The things I'm concerned about aren't necessarily even about it being used for license plates. Our colleague
17:59Dhruv Mehrotra has done some reporting showing that license plate readers can also capture any
18:04words that are visible. So that can be what's on your t-shirt, that could be political signs in your yard.
18:10This technology may be used in ways that we're not even familiar with or wouldn't imagine.
18:15You know, a lot of times when we're talking about any surveillance technologies, it's really about
18:20creating data that then is there and could potentially be used in any number of ways at any
18:26point in the future, depending on who gets access to it and what they want to do with it.
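A small sketch of why plate reads amount to a location history: each camera hit is just a plate, a camera, and a time, and querying the log by plate reconstructs where a vehicle, and by proxy its driver, has been. The plates, camera names, and timestamps are invented.

```python
from datetime import datetime

reads = [
    ("ABC1234", "fixed_cam_main_st",  datetime(2024, 5, 4, 9, 12)),
    ("XYZ9876", "patrol_car_7",       datetime(2024, 5, 4, 9, 30)),
    ("ABC1234", "freeway_exit_12",    datetime(2024, 5, 4, 17, 45)),
    ("ABC1234", "fixed_cam_park_ave", datetime(2024, 5, 5, 8, 3)),
]

def history(plate: str):
    # Every sighting of one plate, in time order: effectively a travel log for that vehicle.
    return sorted((timestamp, camera) for p, camera, timestamp in reads if p == plate)

for timestamp, camera in history("ABC1234"):
    print(timestamp, camera)
```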
18:33The key thing here is that these drones, even small quadcopters, like what we think of as consumer drones,
18:39they can carry a fair amount of cargo, meaning like cameras.
18:45There are a number of different drones used by law enforcement, varying in size and ability.
18:49For example, some drones have thermal imaging capabilities for night operations,
18:53while others specialize in long periods of surveillance. Protesters have in the past
18:57reported drones flying overhead, for example in Minneapolis during the George Floyd protests.
19:01Police and government drones usually fly relatively low, typically a few hundred feet above the ground.
19:06However, it's been reported that the drone used to surveil protests in Minneapolis in 2020
19:10flew at 20,000 feet, nearly invisible to protesters on the ground. This was a Customs and Border
19:15Protection drone; these are often equipped with advanced cameras, radar, and potentially cell phone
19:20geolocation tools. In terms of how freaked out are you about drones, how do you think about that?
19:25Yeah, I would say fairly freaked out. But again, like you were saying about the layering of these
19:30technologies, I think it's not the drones themselves, it's everything they can do and how cheap they are,
19:39and how easy it would be to deploy even more of this tech. When we talk about sort of evolution of
19:46different technologies, this capability is sort of similar to police helicopters. And now it's just
19:53cheaper, lighter, easier. Even these sort of benign seeming quadcopters that we see around all the
20:00time could be carrying equipment on them to do like very granular detailed surveillance of something like
20:07a protest. There are some technologies that are really just emerging and we don't even know if they've
20:14been used at protests or even used by authorities in the United States. Right, and your face isn't the
20:19only thing sort of outside your body that can potentially identify you. For example, analyzing
20:26your gait, like how you walk. Gait recognition technology can identify individuals by analyzing
20:31their unique walking patterns using machine learning. It captures movements through cameras, motion sensors,
20:37or even radar. It then processes this information, breaking it down into contours, silhouettes, and other
20:43distinguishing features. It offers high accuracy, but its effectiveness can be influenced by things like
20:48injuries or the types of terrain the subject is traversing. This tech is especially useful for
20:52authorities when people's faces are obscured. While there haven't been any reports of widespread use
20:57of this tech by law enforcement agencies in the US, Chinese authorities have been utilizing it on the
21:02streets of Shanghai and Beijing since at least 2018. In recent years, there have also been a number of
21:07companies working on creating emotional detection technology where AI uses biometric data to determine a
21:12person's emotional state and the likelihood they will become violent or cause a disturbance.
21:17Wired reporting found that Amazon-powered cameras have been scanning passengers' faces in eight
21:21train stations in the UK to try out this new technology. The trials were testing the system
21:26for age and gender recognition, as well as the emotional state of the person on camera. While there's
21:30no current documentation of this tech being used at protests, the BBC reported that emotional detection
21:35tech has been used on Uighurs in China. Some of these could be really invasive because, you know,
21:41reading your emotions, there start to be maybe inferences that someone could make about how you
21:47were feeling in a certain moment that may or may not be accurate, right? Because it's sort of being taken
21:52out of context. So it's difficult to have an algorithm just sort of come to one conclusion. Like,
21:59sometimes I think you're doing your angry walk coming over when I haven't filed my story, but really,
22:05then you're really nice about it. And you're like, it's okay, Lily, you can do it. And, you know, I could,
22:11I took it totally the wrong way. But potentially there are more sort of, in terms of just identifying
22:17someone in a certain place, it is scary that there's something characteristic about your walk.
22:23They're not saying, oh, it's Andrew's angry walk, but they're saying, oh, that's Andrew. Certainly
22:29creating more systems that are replicating what other things like facial recognition do
22:35and applying it to other biometrics of a person that definitely is going to create all the same
22:40concerns as we've seen with these other technologies that were emerging, you know, years or decades ago.
22:46But now it's your entire body, how you walk. And like you mentioned, like if we're having computers
22:51analyze like how I'm feeling in a certain moment, effectively establishing intent of whatever my actions
22:57are in that moment, that gets really scary because it might be completely inaccurate.
23:02Every time there's one of these new AI technologies, there's always some bias built in. There are going
23:06to be people who suffer consequences unnecessarily because these systems are deployed without being
23:11fully debugged. Experts in the AI field have previously noted that emotional detection tech is
23:16unreliable, immature, and some even call for the technology to be banned altogether.
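To make the gait-recognition idea described earlier concrete, here is a heavily simplified sketch: reduce a walk to a small numeric signature and match it against enrolled signatures by distance. Real systems work on silhouettes and learned features; this toy uses just two hand-picked numbers, and every name and value is invented.

```python
import math

# Enrolled "signatures": (stride length in meters, cadence in steps per minute).
enrolled = {
    "person_a": (0.72, 118.0),
    "person_b": (0.81, 103.0),
}

def identify(signature, threshold=5.0):
    stride, cadence = signature
    best_name, best_dist = None, float("inf")
    for name, (s, c) in enrolled.items():
        # Scale stride up so both features contribute comparably to the distance.
        dist = math.hypot((stride - s) * 100, cadence - c)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(identify((0.73, 117.0)))  # close enough to person_a's signature
print(identify((0.60, 140.0)))  # None: nothing enrolled is close enough
```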
23:23Here are a few simple and effective ways to protect yourself and your personal
23:26information at a protest. First, if you can, leave your phone at home. I know this might
23:31sound drastic, but the most effective way to ensure that your personal data isn't compromised
23:35and that your phone won't fall into the hands of law enforcement is by not having it with you.
23:39If that's not an option, you can put your phone in a Faraday bag so data can't be accessed.
23:43You should also turn off biometrics on your phone, like facial recognition or fingerprint scanner,
23:47meaning you'll need a code to access it. That way your face or fingerprints can't be forcefully
23:51used to access your personal information. You can always say you just don't remember the code to unlock it.
23:55Another thing to keep in mind is posting on social media. Jay Stanley, a senior policy analyst at the
24:00ACLU says if you post something online, you should do so under the assumption that it might be viewed
24:05by law enforcement. You should always check your sharing settings and make sure you know what posts
24:09are public. Try to minimize the amount of other people's faces you capture in your photos or videos.
24:13Use end-to-end encrypted messaging services like Signal when possible. Wear a mask in case photos or
24:18videos are taken. And finally, know your personal risks. Is your immigration status exposing you to
24:23additional dangers? Are you part of a minority group that is more likely to be targeted by law
24:27enforcement? Keep these things in mind for yourself and your loved ones when deciding if you should go
24:32out to a protest. For more information about surveillance at protests, check out Wired.com.
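One hedged sketch of acting on the advice above about minimizing other people's faces in your photos, after the fact: detect faces and blur them before sharing. It uses the open-source face_recognition and Pillow libraries; the file names are hypothetical, and blurring is a mitigation, not a guarantee.

```python
import face_recognition
from PIL import Image, ImageFilter

image_path = "protest_photo.jpg"  # hypothetical file name

# face_locations returns (top, right, bottom, left) boxes for each detected face.
faces = face_recognition.face_locations(face_recognition.load_image_file(image_path))

photo = Image.open(image_path)
for top, right, bottom, left in faces:
    region = photo.crop((left, top, right, bottom))
    photo.paste(region.filter(ImageFilter.GaussianBlur(radius=12)), (left, top))

photo.save("protest_photo_blurred.jpg")
```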
24:36This was Incognito Mode. Until next time.