AI Stories · The Tech We Won't Build

Scientists, Let’s Push Back

Yves Moreau is a professor of engineering at the University of Leuven in Belgium, doing research at the intersection of artificial intelligence and genetics. As a concerned scientist, he is pushing back against the development of mass surveillance technology and the abusive deployment of forensic DNA databases.

This is an extended cut of the interview from The Tech We Won’t Build that has been edited for ease of reading.

Could you tell us about your work?

A lot of my work has focused on DNA databases, the kind that are used by law enforcement across the world. In China particularly, we see abuse of that technology in the form of mass surveillance and facial recognition. I became interested not just in facial recognition, but in ethnic facial recognition.

One day, I came across a Twitter storm about a research paper on facial recognition to identify Uyghurs [a persecuted ethnic group in China]. People had written to the editor of the scientific journal where it was published, but received a pretty dismissive answer that it had been peer reviewed. I’m quite familiar with the academic publishing process and the ethics requirements. When I saw the response, I said, “Well no, this doesn’t really answer the complaints.” I reached out to a close contact, Jack Poulson, and together we wrote a letter to the editor and to the publisher.

There are actually products from several Chinese suppliers on the market that say they can tag people in video feeds based on their ethnicity – in particular, Uyghurs. So you can’t say, “Well, technology can be used for different things, and the authors don’t have ill intent.” The dangers are quite obvious, so you should take that into account. Hundreds of thousands of people have been sent to forced labor camps in Xinjiang in China. There is facial recognition everywhere. There is tracking of individuals and of their behavior.

How often have you succeeded in getting papers retracted?

So far, there have been eight retractions (seven on DNA profiling and one on ethnic facial recognition), and I have requested ethical investigations of around 60 articles at a dozen different journals. But that’s just the tip of the iceberg.

What should scientists consider when they review biometric research?

First, everybody should clearly understand that whenever you’re doing any kind of biometric research – be it DNA, facial recognition, or fingerprinting – it is human subjects research. You need the appropriate consent and ethical reviews. There is a tendency for people to think, “Well, it’s just a bunch of pictures and data on the computer. I’m not doing research on human subjects.” I think computer scientists and engineers need to realize that when you do biometric research, you enable the development of extremely powerful tools that can severely impact people’s lives. You need to reflect on the potential dangers of the technology you are developing.

One paper I was involved in retracting was about predicting whether somebody is a criminal based on their face. I mean, what is it to be a criminal? It’s a very complicated question. So, you take a dataset and you label people “criminal” or “not criminal” based on what? When we look at the science, it’s usually quite poor. People who have a criminal record could be singled out forever. Free inquiry is a core value of mine. But to say that scientists should be able to do whatever we want, unencumbered, doesn’t work. For biometric research today, the threats are so huge that we urgently need to reflect.

Scientists have a huge moral authority over what happens with our research, but we almost never use it. We invented it, we created it, we brought it to the world. We have a special right to talk about it.

When did you start actively pushing back yourself?

About a decade ago, I thought, “I don’t just want to be a scientist writing technical research, I want to take up a role as an intellectual, to enter public debate.” I looked into where I could make meaningful contributions.

I stumbled onto proposed legislation for a surveillance program in Kuwait that would create a national DNA database of the entire population. We were part of a successful effort to push back against that. I moved on to China to look into abuses of DNA databases there. Together with Human Rights Watch, we discovered this infrastructure being deployed in Xinjiang. Later I began looking into the issue of facial recognition and mass surveillance, which I think is the next battlefield for the shape of 21st-century societies.

What is your advice to computer scientists?

Technology does not occur in a vacuum. It is not “morally neutral”. In physics, you had the nuclear bomb. In chemistry, you have chemical weapons. In medicine, you can do horrible things to people. All of these fields have reflected on this. In computer science, we don’t have that culture of thinking about our impact on society. But tech people actually have quite an amazing superpower: they can stand up from their chair, go to the door, and look for another job. And when you calmly, politely explain that there is a problem, it is actually quite common that people will say, “Yeah, actually I think we do have a problem.”

Among scientists closely affiliated with industry and business, there tends to be a strong focus on ‘big science’ that translates into industrial products with a large economic impact. Tech research has become amazingly productive but, in my opinion, more disengaged from social issues.

When I turned my attention to China, one of the very first things I said was, “Well, the role of US suppliers in selling these products to Chinese companies is really problematic.” That drew international attention and had some impact, but we haven’t solved the problems.

The Dragonfly Project at Google was stopped because people were courageous. When Jack Poulson quit, others followed. Eventually Google changed course. We’re talking about a billion-dollar opportunity. Same thing with Project Maven. A small number of people, or sometimes just one person, can have an impact. I feel incredibly lucky to have generated a discussion on all this. But together, we’re stronger.

Portrait photo of Yves Moreau by Hannah Yoon (CC-BY) 2022

Mozilla has taken reasonable steps to ensure the accuracy of the statements made during the interview, but the words and opinions presented here are ascribed entirely to the interviewee.