Calling out Data Weapons

Yeshimabeit Milner is founder and executive director of Data for Black Lives. Her new podcast is called Tunnels. She was previously a campaign manager at Color of Change and has headed several major national initiatives, including OrganizeFor, an online petition platform to build up the political voice of Black people in America.

This is an extended cut of the interview featured in The Tech We Won’t Build, edited for ease of reading.

People often think of data as being neutral; when did you first come to understand data as a potential tool of oppression?

I remember being in fourth grade taking the Florida Comprehensive Assessment Test, or FCAT. And teachers at my predominantly Black, immigrant, Latinx school in my predominantly working-class neighborhood said that our FCAT scores were used to determine how many beds to include in future prison projects. That’s when I first understood that data is political; it’s weaponized and used to justify and facilitate political decisions.

Later, people at a neighboring high school organized a peaceful protest in response to a vice principal putting a ninth grader in a headlock. Instead of hearing their protest, the school brought in SWAT teams. I knew that, in addition to protest, there were other ways to make our voices heard. So I joined the Power U Center for Social Change, and we surveyed more than 600 Miami students about their experiences with suspensions and arrests in schools.

We spearheaded the implementation of restorative justice policies and practices and the phasing out of zero-tolerance policies. If you look at the timeline of the Clinton crime bill and the War on Drugs policies trickling into our school systems, you can see how our experience lined up; the data was a powerful way to reveal that history and push for change.

Tell us about the #NoMoreDataWeapons campaign, which is part of your work at Data for Black Lives.

The idea came out of conversations with folks like Samuel Sinyangwe — an amazing data scientist and activist who spearheaded the research behind Mapping Police Violence — and conversations I saw all over our hubs around defunding the police. In addition to making a demand to defund the police, I wanted to narrow it down to say, “How do we push police departments to scale back on expensive, useless, and highly harmful technologies and data weapons?”

Data weapons exacerbate and magnify the already harmful, racist, and punitive carceral system that we’re living under. Instead of waiting until somebody is up for deportation or incarcerated, we’re finding a way to educate people now.

The main goals of the project are to educate the public on what data weapons are, help shift the national narrative through documentation, and promote storytelling by Black communities and individuals who are directly impacted. We also want to inform policy innovation and cultivate and support targeted campaigns that advance policy on the use of data weapons. All of that is undergirded by the Data for Black Lives methodology, which centers reclaiming data and tools as we fight oppressive uses of data.

Right now, we’re working with an amazing team of volunteers from the Brown University computer science department to build a storytelling tool. Every day, we’re documenting information on uses of data weapons: ShotSpotter technology, stingrays, automated license plate readers, DNA databases, drone technology — the list goes on. But so much of the effort also needs to be in telling the stories and shifting the narrative. So we’re building a tool where people can anonymously and publicly share their experiences with data weapons.

What is a data weapon, exactly?

We define data weapons as any technological tool or application used to surveil and criminalize Black and Brown communities. For the purpose of this project, we’re focused on data weapons used by law enforcement. Some forms of credit scoring are also used as data weapons, in addition to some of the new tools we’re seeing around immigration. This includes larger networks and algorithms such as domain awareness systems, as well as real-time crime centers and fusion centers, which are often actual brick-and-mortar facilities where these technologies and their uses converge.

Are there examples of data weapons currently being used around the world that concern you from a US perspective?

We’re really excited about partnering with organizations in Palestine, Western Europe, and South America. Part of the reason for our international focus is that we believe in solidarity. But it’s also because a lot of these technologies are built the same way, oftentimes by the same companies. Palestine has become a testing ground for data weapons. We saw this in Ferguson, and we’ve seen it in the past, especially since the inception of the 1033 program, which has provided local governments all over the country with military equipment to use on civilians.

Can you speak a bit about these companies?

Our strategy is to focus on bigger companies like Facebook, Google, and Microsoft. But a lot of the companies building data weapons are small firms that specialize in creating these niche, boutique tools. One company, Clearview AI, has pioneered facial recognition technology that — thanks to activism and research — has been exposed as harmful. They have a database of more than three billion images, many scraped from social media applications.

What data do you want to see created, and who should develop it?

These past two years have shown that there needs to be investment in preventative healthcare, housing, and education, and we need data that’s used to achieve those things rather than to expand policing and surveillance.

Law enforcement has way more technology than anyone else. If folks at the health department were equipped in the same way, we would’ve had a way different response to COVID-19. When states weren’t releasing information on how the virus was impacting folks, we developed an entire codebase so people could run their own analyses and build their own tools.

We have an opportunity to use big data and machine learning to exponentially increase the capacity of directly impacted people and to better understand deeply entrenched social issues. A lot of that work looks like connecting scientists and activists.

How do you navigate when to collect data about a community versus when the collection of data itself is the weapon?

For us, it’s less about when to collect data about a community and more about when people in the community identify a problem and data collection is a way to approach it. Data helps us understand an issue in a way that wouldn’t be possible otherwise. It’s a powerful tool for accountability and for amplifying the voices of those who traditionally wouldn’t be heard.

For example, in 2013, I worked on a breastfeeding campaign in Miami. At that time, Black babies were three times more likely than white babies to die before their first birthday. That disparity had remained the same for 50 years, even though infant mortality had decreased significantly nationally. Part of the reason we even started talking about mortality was that folks from our community — Black mothers, Latinx mothers, poor mothers — were having C-sections when they didn’t want them. We also realized that mothers who wanted to breastfeed were instead given formula.

That’s where we got the idea to create a scorecard for all the hospitals in Miami, Florida. We developed it by interviewing and surveying mothers all over the city, including a woman who experienced the death of her child in a hospital.

We took our findings to our local hospital CEO. We had been trying unsuccessfully to get a meeting, but once we collected the data and wrote the report, not only did the hospital take our meeting, but it couldn’t deny the data we had collected. Officials adopted the specific policies that we identified and pushed for.

What message do you have for researchers and data scientists working on projects using data on vulnerable communities?

I would ask some questions: Who else should be at the table? Who are the people behind these numbers? Once people start asking themselves those questions, I think they are going to think differently.

It takes a lot of imagination and creativity to move out of the rigid definition of how data should be used, to think differently and reclaim it to make tools that are not oppressive. I think part of the problem of racism is that it’s a problem of imagination; we are so used to these rigid structures and ways of seeing things. That’s what stereotypes are, that’s what narratives are. But we have an opportunity with technology to break through that. I invite people to come to our next Data for Black Lives conference, where we’ll have the folks who are building these tools and the folks who are directly impacted by the tools not only at the same table, but in conversation and in community. It’s extremely powerful. We believe that is how change is going to happen.

Portrait photo of Yeshimabeit Milner by Jed DeMoss © 2022

Mozilla has taken reasonable steps to ensure the accuracy of the statements made during the interview, but the words and opinions presented here are ascribed entirely to the interviewee.