Transcript: When an Algorithm Is Your Boss
This is the transcript of Mozilla’s IRL podcast episode, When an Algorithm is Your Boss from August 1, 2022 (IRL Season 06, Episode 02).
Bridget Todd:
It’s hard to pass up. Ordering a late night meal. Sushi, pizza, Pad Thai. You press the complete order button on your phone, and like magic the food appears at your door. You don’t know how it all happened and you don’t need to. But what if you were the one delivering the meal? What do you do when you suspect something isn’t fair behind the scenes? Don’t you have a right to know how the algorithms work?
Bridget Todd:
José Gonzalez is a delivery driver in Quito, Ecuador. He’s frustrated after working for three years with one of the big international apps. He’s come to a May Day demonstration on the streets of his hometown.
José Gonzalez:
[Spanish].
Having an algorithm as a boss is the worst because you don’t have any communication with any other person. It’s an algorithm. You are a human talking to an algorithm with a computer. It is a boss that makes you more precarious than any human boss ever could.
Bridget Todd:
We’ll hear more from José in just a bit. I’m Bridget Todd, and this is IRL, an original podcast from the nonprofit Mozilla. This season, we draw from Mozilla’s Internet Health Report to bring you five episodes on the perils and promise of AI.
Bridget Todd:
Today’s challenge, how do we defend the rights of workers when companies use AI to exploit them? I want to introduce you to someone who is dedicated to answering one powerful question. What if gig workers could open up and analyze the algorithms they depend on?
Eduardo Meneses:
If you take more than 50 minutes, you’re suspended for two days. So that means that for two days you cannot work.
Bridget Todd:
Eduardo Meneses is describing what it can be like to have an algorithm as a boss, if you’re one of the tens of thousands of people who rely on gig apps for work in his country, Ecuador.
Bridget Todd:
If this particular app registers that you were even a minute late making a delivery, you’re barred from working for two days. And that’s a big deal if you’re barely scraping by. There are apps for house cleaning, taxi driving, and more. Not just in Ecuador, but in nearly every country in the world.
Bridget Todd:
Do they pay living wages? Offer fair opportunities? Some do, but the so-called gig economy has added a new technical dimension to the risk of exploitation for precarious workers. And it’s glossed over with marketing language about job flexibility.
Eduardo Meneses:
It is very interesting, because very often the platforms explain publicly that they do not want to recognize the workers as in a formal working relationship with them. Because they need flexibility, right? This is something that always come in the discussion. But in fact, when you go back to see what’s really going on in these platforms and this algorithmic management, you find the opposite. I mean, these platforms are, they’re totally opposite of flexibility.
Bridget Todd:
So Eduardo isn’t actually a gig worker or a driver. He’s the global head of social change at ThoughtWorks. That’s a large tech consultancy. He’s working with a labor association for platform workers and researchers from the FLACSO university in Quito. They’re working on a plan to audit the algorithms by supporting workers to collect their own data. So that companies can’t obscure exactly how much workers are paid or on what terms they’re offered work.
And just as machine learning is used by companies to process data, Eduardo believes that advocates for workers can use data and machine learning to reverse engineer secretive systems of the gig work economy. To him, it’s about amplifying the work of social movements with technology.
Eduardo Meneses:
We have been so much told this narrative about the neutrality of technology, and not only the neutrality, but this tech solutionism myth that we do have in our society, that we are…it’s very difficult, not only for the people who are not in the tech industry but even for the people who are in the tech industry, to understand that that’s not true. That, I mean, technology can solve some problems, but definitely technology can be part of the problem. We find all the systemic inequalities, racism, colonialism, all of this is embedded in technology.
Bridget Todd:
There’s a corporate narrative about how gig work platforms empower their workers to be their own bosses. But many workers say the experience of being completely expendable to companies, without a human point of contact, tells a very different story. Many are operated entirely from abroad, with headquarters in countries including the US and China. Here’s José.
José Gonzalez:
[Spanish].
You don’t know how much money you will make in a day. Whether you will get gigs or not. It’s an algorithm. You have different kinds of bosses: the clients, the delivery dispatcher, and the application. You have to keep everyone happy. If any one of them perceives that you did anything wrong or made a mistake, you get sanctioned. And how do they sanction you? By taking away work hours, downgrading you in groups and categories. This means you as a person can’t connect.
Bridget Todd:
José says there’s no way to know how decisions are made, no way to appeal, and no transparency. Eduardo says that even the amount of money workers are paid for their time and distance traveled is unclear. His research team is gearing up to audit these apps from the outside, by developing an open source app to collect data for workers.
Eduardo Meneses:
This black box between what the worker does and what the app is giving as a compensation, an economic compensation, needs to be mediated by the possibility of debating what is inside.
Bridget Todd:
The vision for the app is that it will track the distance delivery workers travel, the pay they receive over time, and possibly even spot inconsistencies with what delivery app users are told workers are getting.
Bridget Todd:
Ultimately Eduardo believes that apps like this can support labor movements and unions to use data to push for policy changes. These companies are touted as technical success stories, but when the research network Fairwork ranked platforms around the world, they found that far too many gig work apps do not offer fair pay or conditions.
Bridget Todd:
So workers protesting algorithms, that’s a real thing. But so is using technology to build alternatives. Tens of millions of workers are dependent on the big platforms, but among the few initiatives where workers are the main beneficiaries, you see worker-led cooperatives for taxi rides and food deliveries, and data sharing platforms for empowerment. The Drivers Cooperative, for example, is based in New York. Since they started in 2020, they now say they have over 6,500 drivers. And then there’s CoopCycle, based in Europe. It’s a federation of courier platform co-ops and it has members around the world. These alternatives face an uphill battle. They’re competing against giant companies, with their tech teams and billionaire investors.
Bridget Todd:
Part of leveling the playing field is making sure that bigger companies don’t gain an advantage by offering an unfair deal to the people that work for them. From his desk at the MIT Media Lab in Cambridge, Massachusetts, PhD student Dan Calacci helps create tools that allow workers to take control of their data. Similar to what Eduardo aspires to in Ecuador. Dan told us about one of those apps, a project called WeClock.
Dan Calacci:
WeClock allows workers to collect information about their daily life and use it to measure their working conditions.
Bridget Todd:
The data is collected through a basic API on your phone. It’s data about your movement, your location. Stuff that’s often sent to companies that want to understand their users better. This time though, it’s the workers themselves who are gathering the data.
Dan Calacci:
And then if I want to share it with an organizer or with a researcher, I can export all of it as CSV.
Bridget Todd:
Like Eduardo, Dan knows that giving workers control over their own data is crucial. Once WeClock has collected data, it can be exported anywhere at the worker’s discretion.
Dan Calacci:
It allows them to take that data and share it with other workers and organizers, so that organizers can use data more effectively in worker campaigns.
Bridget Todd:
WeClock has two code bases, one for Android and one for iOS, and it’s open source. So anyone can contribute to or audit the code base themselves.
Dan Calacci:
The technical aspect of it is really not so complicated, because our goal is to basically give people ownership and control over the data that’s already being collected from them by other apps, through their device. We’re just using the same libraries and systems that other apps use, but offering it up as a flat file that’s easy to share on your phone.
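The flat-file export Dan describes amounts to writing timestamped sensor samples out as CSV. Here is a minimal sketch of that idea in Python; the field names and sample values are invented for illustration, and this is not WeClock’s actual code.

```python
import csv
import io

# Hypothetical location samples as an app like WeClock might collect them:
# (timestamp, latitude, longitude). Values here are invented for illustration.
samples = [
    ("2022-08-01T09:00:00Z", -0.1807, -78.4678),
    ("2022-08-01T09:05:00Z", -0.1821, -78.4690),
]

def export_csv(rows):
    """Serialize samples to a CSV string a worker could share with an organizer."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp", "lat", "lon"])
    writer.writerows(rows)
    return buf.getvalue()

print(export_csv(samples))
```

The point is exactly what Dan says: the data is already being collected by other apps through the same device APIs; the only new step is handing the worker a shareable flat file.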
Bridget Todd:
Another app Dan’s working on, still in beta, is aimed specifically at delivery drivers in the US who are working with multiple apps, as many people do.
Dan Calacci:
You can go in and add your pay, and using all that information it’ll show you things like your average pay per mile, or your pay after expenses, if you input your expenses. Things that are really helpful to tracking your own personal work as a gig worker. But then it also allows you to share that with researchers like me, by using the app, but manually by exporting your data and sharing it with advocates as well.
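The per-mile and after-expenses figures Dan mentions reduce to simple arithmetic over a worker’s own records. A minimal sketch with invented numbers, assuming each record holds pay, miles driven, and expenses (the record layout is an assumption, not the beta app’s actual schema):

```python
# Hypothetical gig records across multiple apps: (pay_usd, miles, expenses_usd).
# All values are invented for illustration.
gigs = [
    (12.50, 6.0, 1.20),
    (8.00, 4.5, 0.90),
    (15.25, 9.5, 2.10),
]

total_pay = sum(p for p, _, _ in gigs)
total_miles = sum(m for _, m, _ in gigs)
total_expenses = sum(e for _, _, e in gigs)

pay_per_mile = total_pay / total_miles   # average pay per mile
net_pay = total_pay - total_expenses     # pay after expenses

print(f"pay per mile: ${pay_per_mile:.2f}")
print(f"net pay:      ${net_pay:.2f}")
```

The same totals, aggregated across many workers who opt to share them, are what make the data useful to researchers and advocates.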
Bridget Todd:
For Dan, this work is about something more than just pay rates though. It’s also about privacy and basic rights to data from your workplace.
Dan Calacci:
When you produce any information at work, whether it’s an email you send or the key logger that’s on your desktop as an office worker working from home, or your location as a platform worker, it’s not yours. That data is owned by your employer. It’s through IP law, copyright law, and often trade secret.
Dan Calacci:
I think of location data, for example, as a form of biometric data, because it’s so unique. A piece of your location data over just a few hours can uniquely identify you from everyone else on the planet. And Uber and Lyft have that from you, day in, day out, and you should have access to that.
Bridget Todd:
Sometimes the apps Dan works on do more than just collect data. Consider the Shipt calculator, for example. It was basically doing an audit on an algorithmic change that the delivery app Shipt rolled out in the summer of 2020.
Dan Calacci:
It allowed workers to submit screenshots of their pay over time, to a phone number. And when they submitted those screenshots, it would automatically infer if workers were being paid by an old algorithm, which was transparent, or a new algorithm, after Shipt had switched to a black box. And it would tell workers how their pay had changed, if they were being paid under the new algorithm.
Dan Calacci:
And more than that, it allowed us to aggregate pay information from workers, hundreds of workers who contributed data across the country, to show that the new black box algorithm that Shipt was claiming was fair, was actually resulting in a significant pay cut for 41% of workers.
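The inference Dan describes can be approximated in a few lines. This is an illustrative sketch, not the Shipt calculator’s actual code: the old-formula constants (a $5 base plus 7.5% of the order total, as reported for Shipt’s earlier transparent commission model) and the matching tolerance are assumptions for illustration.

```python
def old_formula_pay(order_total):
    # Shipt's earlier, transparent commission model was reported as a
    # $5 base plus 7.5% of the order total; treated here as an assumption.
    return 5.00 + 0.075 * order_total

def paid_under_old_algorithm(order_total, payout, tolerance=0.05):
    """Infer whether a payout matches the transparent formula within tolerance."""
    return abs(payout - old_formula_pay(order_total)) <= tolerance

# Hypothetical pairs parsed from pay screenshots: (order_total, payout).
orders = [
    (100.00, 12.50),  # matches 5 + 7.5% of 100 -> old algorithm
    (100.00, 9.80),   # does not match -> new black-box algorithm
]

for total, payout in orders:
    label = "old" if paid_under_old_algorithm(total, payout) else "new"
    print(f"${total:.2f} order paid ${payout:.2f}: {label} algorithm")
```

Classifying each payout this way, then aggregating across many workers, is how the project could quantify how many people ended up worse off under the black box.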
Bridget Todd:
The potential for opening up those black boxes and reclaiming workers’ rights is intriguing. But creating fully functional tools is extremely time intensive. And expensive.
Dan Calacci:
One of the things that I found really frustrating, honestly, about working to build tools for platform workers, is that folks who work on apps are used to using apps that are buttery smooth, right?
Dan Calacci:
No problems, built by literally billion dollar companies with enormous engineering and design teams. That are building designs every day to make flipping through the screens and accepting jobs and doing stuff, I mean, sometimes easier, in some ways harder, if they’re trying to use patterns to control your work.
Dan Calacci:
But it’s so hard to compete with that as an independent developer, trying to build something scrappy to organize. And something that’s been really hard, is to do things like beta tests.
Bridget Todd:
Dan hopes that more of the audit work they do will be taken up outside of the research world.
Dan Calacci:
Tools to audit existing algorithms, just like frameworks, tools, examples even, of doing successful algorithmic audits, real world ones, they’re few and far between. And oftentimes they’re just like a paper that’s someone simulating a lot of data and testing some stuff. It’s not testing the real world impact. We need more of that work.
Bridget Todd:
Progress is slow, but champions for workers’ rights are connecting across borders to develop new technical and legal strategies to change the rules of the game. With regulations for AI, and platform work specifically, that change could be accelerated.
Aída Ponce Del Castillo:
We have treated workers as slaves, as robots in a way, since the history of human labor.
Bridget Todd:
Aída Ponce Del Castillo is with the European Trade Union Institute in Brussels. She’s a lawyer and a senior researcher. Aída sees this fourth industrial revolution we’re living through as a particular cause for concern.
Aída Ponce Del Castillo:
This new revolution, so to speak, is also very different because it is very novel in a way that we cannot see it. It has a characteristic of invisibility and immateriality, whereas with industrial revolution, we could see a train, we could see a factory, we could see a machine, we could see an accident with blood and injuries. Today, we don’t have that. We don’t have that visibility of the risks, and the injuries, and the potential harms.
Bridget Todd:
Aída is concerned with that same black box attitude, the arms-length approach that Eduardo and Dan are worried about.
Aída Ponce Del Castillo:
We don’t know what we are dealing with because we cannot see it. Because we cannot deconstruct it, scrutinize it, analyze it, and understand it. It’s very difficult. If you have a cigarette, for example, and you want to understand its risks, you can take it, open, and take the tobacco and analyze it. And you have a specific report.
Aída Ponce Del Castillo:
When you would like to analyze an algorithm, first, you need to know whether there is an algorithm or not, second, good luck in trying to get it from its owner, or developer, or employer, because we don’t know whether that would be possible.
Bridget Todd:
There are laws already being written in Europe that seek to regulate how algorithms are being used in the EU. But as Aída points out, it’s not necessarily workers’ rights that policy makers have been worried about.
Aída Ponce Del Castillo:
The European Commission just drafted, very recently, the AI Act. And it is a product market legislation.
Bridget Todd:
Aída says the AI Act, as it was initially proposed by the EU Commission in April last year, wouldn’t be enough to protect workers.
Aída Ponce Del Castillo:
We were quite excited, because we thought that the Commission will put forward provisions about rights or new digital rights. And it was not the case. The AI Act is made just to put products into the market. Period.
Bridget Todd:
So Aída and many others work to bridge that gap. Trade unions are lobbying the European institutions and Aída advises them. The Commission eventually proposed additional rules for improving platform work in December. EU laws are negotiated for a long time between many stakeholders, but if this directive is passed, it could improve the lives of workers. At least in Europe. But it could also raise the bar for other countries.
Aída Ponce Del Castillo:
As a platform worker, the first advantage that you will have is that you would be recognized as a platform worker, as an employee of the digital labor platform. And that will change your life because it will give you access to social security, perhaps to insurance. And then you will have some guarantees about whether you will be dealing with an accident, a workplace accident or you will have your social contributions to your pension. Today, that is not the case.
Bridget Todd:
That’s not all the directive would do though. In addition to guaranteeing those protections, it would give workers the right to disagree with computers.
Aída Ponce Del Castillo:
You will be able to challenge the algorithm. This directive has a new right, which is the right to contest. First of course, the right to access the algorithm and the logic behind it. If you as a platform worker just discovered that you have been affected negatively in one way or another by the app or the deployment of the algorithm, you can ask the digital labor platform, excuse me, can I exercise my data access right? And please give me the rationale behind this decision.
Bridget Todd:
So Aída is imagining policies that give workers real, concrete rights. And the policy future she wants to build goes beyond exposing code or usage statistics.
Aída Ponce Del Castillo:
There is a lot of focus on this transparency obligation. There is little focus on how to make this transparency really accountable, meaning by enforcement of authorities or by sanctions like administrative fines, or by giving people more rights to exercise, truly, their rights. To get information, to contest decisions in a meaningful manner.
Bridget Todd:
Europe has been leading the way when it comes to data rights, but policy interventions on AI, as well as platform labor, are brewing elsewhere too. Gig work is becoming a new norm and the fragmentation and automation of labor with AI is happening across many industries.
Bridget Todd:
These issues affect everybody. Remember the May Day protest in Ecuador, where we met José? Yuly Ramirez was also there. She’s a delivery driver and the general secretary of an organization of digital platform workers in Ecuador. Yuly says, when it comes to regulation in Ecuador, the fear of falling behind technologically is given priority over human rights.
Yuly Ramirez:
[Spanish].
They want to do everything digital, everything robotic. But the human worker will always be there. So where do we fit as human beings? It’s a complete error to think like this. That’s why we’re protesting and why we will keep protesting.
Bridget Todd:
Okay. So we could be dramatic and say that platform workers are the canary in the coal mine of our future with AI. But the technology itself isn’t the culprit. It’s what the industry chooses to do with it that matters. We need policies to set guardrails for innovation. And we need people to be able to challenge decisions made by algorithms, in ways that build trust and security for all.
Bridget Todd:
In the rest of our five part series, we’re doing deep dives into healthcare, maps, disinformation, and more. Because AI really is everywhere, but who has the power? We’re talking to people around the world who are challenging the status quo to make AI more trustworthy.
Bridget Todd:
This is IRL, an original podcast from Mozilla, the nonprofit behind Firefox. Follow us and come back in two weeks. This season of IRL doubles as Mozilla’s annual Internet Health Report.
Bridget Todd:
To learn more about the people, research, and data behind the stories, come find us at internethealthreport.org. I’m Bridget Todd. Thanks for listening.