

Tech’s Broken Promises

This is an extended cut of the interview from The Tech We Won’t Build that has been edited for ease of reading.

If we’re talking about the tech that shouldn’t be built, what’s the first thing that comes to mind for you?

My colleagues and I are usually those people in the room, always sort of questioning why something needs to be built in the first place.

For people here in Pakistan and throughout the subcontinent, technology is often approached like a silver bullet that’s going to fix problems that are actually much more complex. Just building an app or a piece of technology doesn’t bypass the nuanced and more difficult questions rooted in history and social issues. I think that is the trap a lot of technology development has fallen into. Unless we’re willing to grapple with those larger questions, we can’t use technology as a shortcut.

Some technologies are inherently dangerous, but then there are more indirect ways in which technology can be harmful because we attach more promises to it than it can possibly deliver. 

What is it like to be on the receiving end of technologies built by foreign superpowers?

I think the relationship many of us have with technology is one-sided, especially in the Global South where a lot of the apps that we use, the devices that we use, have been built in other places, in other contexts. And that is a really important issue, because if that tech is not built with you in mind, or your needs in mind, then that is a sign that you’re excluded from those conversations. We’re outside that room where those conversations are happening, banging on the door, trying to get our voices heard.

So how do we fix this? Somehow we’re supposed to come up with solutions, as an add-on to what they have built. It’s always an afterthought. It also speaks to the fact that a lot of capital is concentrated in the Global North. That is where the technology is going to be built and designed. These crucial design decisions are made, and only later is the technology adapted to other contexts.

What about surveillance and military use of technologies?

In Pakistan, there’s a lot of criticism of weaponized drone technology, which has been politicized and is part of anti-US sentiment in the region. If you talk to people here about it, they’ll probably give you a really articulate answer about why drones are bad. Whereas if you talk to the same people about AI or facial recognition, they may not know what you’re talking about.

The first time I read about drones, I was just completely taken aback. It’s such a clinical way of taking people’s lives while sitting halfway across the world. That there would be people sitting in parts of the US, feeling like they were playing a video game over terrain they don’t really understand, executing these kill lists, was so jarring. It is presented as this very precise way of killing people. But there were so many horrendous reports of, for example, a wedding that was erroneously targeted, or of entire families killed because one person within a household was suspected of having links to terrorism.

You have been critical of other applications of technology in the name of security. For instance, state-sponsored mobile phone apps for women’s safety in Pakistan.

These are apps that are asking women to give up all of their privacy in order to have some abstract promise of security from the state.

The essential message to women is that if you want security, you will have to give up privacy. That is extremely problematic, but also indicative of how the state conceptualizes security itself: surrendering yourself to complete and total surveillance as the way of being granted that privilege of security.

Women are hesitant to go to the state for security for a number of reasons. The biggest misconception about violence against women or gender minorities is that they will be attacked in public places, by people they don’t know. In fact, we know that an overwhelming majority of violence happens within the home or is committed by people the victim knows, so these apps are completely ineffective in that sense.

I know very few women in Pakistan who actually have these apps on their phone. I think it also speaks to a lack of understanding of what Pakistani women actually feel insecure about in public spaces, and of their relationship to technology and the state.

You work for an influential digital rights group in Pakistan. How would you like to see big tech collaborate with local groups?

When you are going to include people in a conversation, I think you have to make sure that it’s not just about ticking some boxes for diversity.

One of the main problems that I see with the consultations that do happen in the Global South is that companies will have this one-off conversation and then disappear. A year later they’ll come back with something else, and they don’t have to answer for the fact that they didn’t act on certain key recommendations or issues you raised. I think that’s the main frustration that my colleagues in the region and I have raised over the years: we don’t want to just sit in a room and hand over all our expertise while the companies and the people building this tech aren’t held to account. That’s the really extractive relationship we’ve seen, even when voices are included.

What are your thoughts on collaboration with policymakers?

I think there’s this eagerness to make a policy and hope that somehow that’s going to fix things. We see that currently happening with data protection laws. There is no data protection law in Pakistan right now, but there’s this larger global movement towards data protection laws, especially since the GDPR came into force in Europe. But there’s this inability to accept that sometimes questions are very complex and we can’t solve them with just one policy. It might require much more structural interventions, or maybe several policies. It’s a very difficult position to be in, to sit in a room of policymakers and tell them not to make a policy, because then you’ll get accused of not trying to find a solution. Or if you present any opposition to the data protection policy being offered, you are instantly branded as someone who’s not interested in privacy.

Policy in and of itself is not the solution, nor an unqualified good outcome. In my context, I can see how we have been grappling with the after-effects of bad policymaking, incomplete policymaking, or policymaking that is not inclusive. That has even more disastrous effects because, especially when it comes to tech, a lot of these conversations are very complicated.

Portrait photo of Shmyla Khan by Hannah Yoon (CC-BY) 2022

Mozilla has taken reasonable steps to ensure the accuracy of the statements made during the interview, but the words and opinions presented here are ascribed entirely to the interviewee.