
The AI Revolution Comes With the Exploitation of Gig Workers
Business process outsourcing (BPO) companies manage the human work behind AI development. However, they face accusations of worker exploitation, underpayment and wage theft. Big tech companies benefit from this work model.

In December 2024, John, based in the UK town of Brighton, was looking for extra money to supplement his income as he was about to begin a PhD in literature. An online ad for training large language models (LLMs) for Outlier, a US-based company in the AI sector, caught his attention.
Outlier is a key player in this sector. Its parent company is Scale AI, which counts among its major financiers Peter Thiel, the co-founder and former CEO of PayPal and the first outside investor in Facebook.
Artificial intelligence systems excel at tasks such as translation, data processing, computer programming, and holding natural conversations. To perform these tasks, the models require training. John would be prompting AI systems, asking them questions and then evaluating the responses for correctness, conciseness, and suitability. This process, known as «AI training», is designed to improve the systems’ responses.
AI training is a burgeoning employment sector, with hundreds of thousands of people performing these tasks. John’s new job seemed low-stress: sitting at home and typing. However, the starting wage was low for such a game-changing industry – only 15 dollars an hour, which at the time was a few cents above the UK minimum wage.
The company did not pay John for learning time. After three months, he had spent about 36 hours on paid work and 12 hours on training, meaning a quarter of his time was unpaid. «This brought my earnings under the national minimum wage,» John explains. If he worked longer on a task than was expected of him, the overtime was paid at a reduced rate. Extra training tasks also paid him less. «The pay scale was wacky,» he says.
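A back-of-the-envelope calculation makes the shortfall concrete (assuming, as a rough sketch, that the advertised $15 rate applied to all 36 paid hours):

\[
\frac{12\ \text{h unpaid}}{36\ \text{h paid} + 12\ \text{h unpaid}} = 25\%,
\qquad
\frac{36\ \text{h} \times \$15/\text{h}}{48\ \text{h total}} = \$11.25/\text{h}
\]

With $15 only a few cents above the UK minimum wage at the time, an effective rate of $11.25 an hour falls well below it.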
He would train for a project, only to find out it was not happening. «I wasted two hours preparing for no project,» he says. «There should be a reasonable pay for time spent, regardless if it’s training or whatever.» New tasks were few, with long pauses in between, and there was barely any support. «I’m still on their system but honestly can't be arsed with it,» he concludes.
John is not alone. Many remote workers in Europe and the US who work on training and fine-tuning large language models for generative AI receive no payment for their own training, meetings, breaks, discussions with managers, lunch hours, and holidays. On top of that, their payment is reduced for overtime. This method of cutting wages to the bare minimum is similar to the one applied everywhere in the gig economy.
Online gig work is fast becoming a key sector of the global economy, accounting for 4.4 to 12.5 percent of the global labor force, according to a 2023 World Bank report. That puts the number of online gig workers at 154 to 435 million worldwide.
To grasp the scope of the exploitation of data workers worldwide in this critical field, we contacted over 200 data workers on four continents and conducted over 50 in-depth interviews with contractors for Outlier and its competitors. Nearly all of the workers we spoke to were involved in LLM training, such as writing prompts themselves or evaluating the outputs of prompts written by others. Such tasks can require specialist knowledge of data processing. Other tasks involved flagging «inappropriate» responses, such as violent or otherwise sensitive content, spotting factual or grammatical errors, or identifying health and safety violations.
Training the machine at no cost
At Outlier, the hours gig workers spend in training and in meetings are mostly unpaid, according to workers from Portugal, the US, the UK, Argentina, and Germany.
«[What I earn] very roughly corresponds to the absolute bare minimum wage,» says a Portugal-based language specialist at Outlier, «though the value can vary from project to project.» Including all the meetings, admin, and training, the worker says it would «for sure» be below the minimum wage. This also comes with zero benefits. Another Portugal-based language worker for Outlier estimates the hourly rate, again including all unpaid work on the platform, at around $4; a US Outlier worker estimates it at less than $5; and a worker in Germany at $7. All these rates are below the respective countries’ minimum wages.
In the UK, some of Outlier’s job ads do not even hide that AI training is paid below the minimum wage of £12.21. To add insult to injury, the company adds: «For non-core work, such as during initial project onboarding or project overtime phases, lower rates may apply.» One ad did not mention that these lower onboarding rates may be zero. This is a form of «wage theft.»
Milagros Miceli, sociologist, computer scientist, and head of a research unit at the Weizenbaum Institute Berlin, argues that BPO companies have always followed the strategy to «give the impression that training is a form of qualification,» framing it as a bonus in itself and as such unpaid. Mostly, these skills are useless outside the project’s context and cannot be applied to other projects, making them of even less value than unpaid internships. «Unpaid time that is attached to this type of work is a form of exploitation,» she adds. «This has been business as usual for those companies and platforms for a number of years,» says Antonio Casilli, Professor of Sociology at Institut Polytechnique de Paris and author of «Waiting for Robots: The Hired Hands of Automation.» «Since the beginning, they have been predicated on wage theft. By and large, training is self-administered, which a worker has to do on their time and on their own dime.»
A 2022 study based on interviews with workers in the Global South found: «On average, respondents work 22.7 hours per week on the platform. Of these, 7.8 are unpaid and 14.8 are paid, meaning that on average, approximately 34 percent of all time spent working on the platform is unpaid.»
One Outlier data worker in India worked for around three days, approximately five hours a day. «My task was to analyze, validate, and compare two answers provided by two different AI models for a particular prompt. Initially, when the amount was around $5, they didn't make the payment after a week [had] passed. Still, I did proceed to work and [earned] approximately $20-$30. When I was waiting for the payment, they sent an email saying that I had breached their policy and suspended my account. When I reached out to them, I didn't get any proper response. I was paid nothing at all.»
When we approached Outlier with these accusations of wage theft, the company replied but would not answer our specific questions.
Scale AI: a «$25 billion» company «engaging in wage theft»
Outlier’s parent company, Scale AI, was founded in 2016 by the then-teenage American tech prodigy Alexandr Wang to provide data annotation and labelling workers, and it has since branched out into training LLMs.
Former PayPal CEO Peter Thiel’s Founders Fund made a $100 million investment in Scale AI in August 2019. Meta and Amazon have since become investors, with Meta taking the lead most recently. The company sought a valuation of $25 billion in March this year (it was valued at $14 billion in 2024) and is labelled a Silicon Valley «unicorn» by the tech industry.
As of March 2025, Scale AI’s customers have included Europe-based business consultancy giant Accenture, Germany-based SAP, and UK-based Deloitte. (All three companies were contacted to comment on their relationship with Scale AI/Outlier. None responded.) US tech giants Meta, OpenAI, Anthropic, and Microsoft are also clients, as are Canada’s Cohere, the White House, and the US military.
Outlier was founded in 2023 and is «dedicated to advancing [Generative AI] through specialized human expertise,» states general manager Xiaote Zhu. According to the company, it drives AI advancement by offering data workers «a flexible, low-commitment option for earning additional income» – a euphemism for precarious and informal human work. Outlier does not refer to its staff as employees but as «contributors.»
In the past year alone, tens of thousands of contributors from around the world have earned hundreds of millions of dollars on Outlier, says Zhu, giving the impression that the billions invested in and earned by the company trickle down to the workers. In 2023, Scale AI even told Forbes magazine they were committed to paying workers «a living wage.»
This is not happening.
Outlier builds a screen between its clients and the data workers. The clients and projects have code words, such as Cabbage Patch, Jellyfish Rubrics, and Laurelin Sun. Mostly, the staff do not know who the clients are. «It’s strictly forbidden to discuss our suspicions,» says a Western Europe-based language specialist at Outlier. Outlier workers must sign non-disclosure agreements (NDAs), which is also standard procedure at similar companies. The company also forbids its workers from talking about job specifics outside work. «They tell us to avoid talking about what we do,» says an Outlier contractor.
We accessed an internal document that lists Google as an Outlier client, presumably using its service to train the Gemini model. Meta, OpenAI, and Alphabet (Google) also appear to be clients, as their names are mentioned in a Californian court case.
Automated systems track workers' performance
Outlier’s overtime policy only works in one direction. The payment does not correspond to the amount of time spent per task. A Western Europe-based language specialist at Outlier says: «One-hour tasks, on the whole, are too hard to do in one hour, and a worker might soon realize they are spending far more time not getting what they expected to earn.»
An automated on-screen timer by US-based remote work software firm Hubstaff monitors the staff and records their work time. Data workers can pause the timer to take a (toilet) break and resume it afterwards, so even toilet breaks are excluded from work time.
If data workers need extra time, they can either abandon the task and stop being paid, or continue the task at a reduced rate. «Since the alternative is getting nothing, this is seen as better than nothing,» says an Iberian Outlier worker.
If the task is still not completed after the allotted overtime, data workers are not paid at all, as two data workers from different continents confirmed. (This was not the case in all projects.) «When the timer stopped, the whole thing I worked on was scrapped,» says a North Carolina-based Outlier worker. «I spent two hours reworking a document. [The system] flagged two issues to correct before the clock ran out. I found and fixed one, but the other was not apparent. Nothing I tried could fix it. The clock wore down and I was facing zero payment for that task,» reports a US-based PhD teacher.
Milagros Miceli calls the practice of not paying for allegedly uncompleted tasks «super common.» «The funny thing is that workers wouldn't describe it as wage theft,» she says. «It's so naturalized that they would describe it as: ‘Oh, this is the way it is. This is the gig economy, right?’ Like you get paid per task. And if the task takes longer than what the requester said it would, then that's the way it is.»
Work as a «lottery»
The lack of steady work is another major problem data workers have to face. Gig work is not a reliable source of income. This is the virtual form of going to the factory gates every morning to see if day laborers are being hired.
«Task availability becomes extremely limited due to the increasing number of workers,» says a Portugal-based language annotation worker at Outlier. «Some people stay up all night to grab tasks as soon as they appear. I sometimes receive an email around 1:00 AM, notifying me that tasks are available, but by the time I wake up, they’re already gone. It often feels like a race to secure tasks and earn money. It feels more like a lottery than a stable workflow.»
All Outlier workers fear the «Empty Queue», when no more work is available – a situation with which other gig workers are also all too familiar. A Malaysia-based AI trainer for the Cyprus-based AI support firm Mindy Support reports: «The wages are higher than minimum wage in my country, but the job volume is too small, so I can't live only on it.» An Indonesia-based data annotator for Mindy Support knows this experience first-hand: «Annotation is my passion. But there is no project available this month. Very soon, it's possible I will search for another job.» Germany-based workers also lament a lack of work. «After completing some tasks, I faced long periods without new assignments or feedback, which became frustrating,» says a graduate Outlier worker. «I only get tasks once a week,» says an American student in Germany working for Outlier. «Most of my time is spent with pretty much nothing to do.»
Instability is «one really big disadvantage for workers,» summarizes Mira Wallis, associate member at the Berlin Institute for Migration Research at the Humboldt University, who has researched data workers’ experiences. «If the AI company doesn't need them anymore, it can stop the next day. There is a contradiction. On the one hand, I found people saying: ‘This job gives me security, if the economy is in crisis, I can always work online.’ But on the other hand, they say: ‘This is not a stable job, because it can stop any day.’»
«The sordid underbelly propping up the generative AI industry»
Dubious employment practices are so prevalent in generative AI support that the companies behind it have been targeted in three major court cases in California.
In December 2024, former Scale AI/Outlier contractor Steve McKinney filed a lawsuit against Scale AI in the San Francisco Superior Court for «wage theft and misclassifying workers». «Scale AI is the sordid underbelly propping up the generative AI industry,» states the complaint, alleging that the company employs «an army of misclassified independent contractors responsible for the generative Artificial Intelligence boom.» The tactics depicted in this document match what workers told AlgorithmWatch: unpaid training time, too few hours to complete tasks, and unpaid overtime.
The complaint also accuses Scale AI of «bait and switch» employment tactics: «McKinney was promised a pay rate of $25 per hour during his application and onboard process but was ultimately paid only a portion of the promised amount.» The defendants have failed to pay the minimum wage required under California law, the complaint alleges.
Another case filed in California in January 2025 claims Scale AI «willfully and knowingly failed to pay premium overtime» and compensation for rest periods, as provided under California law, to a plaintiff based in San Diego who was employed as an annotator by Scale AI in 2024. In October 2024, Scale AI, Outlier, and HireArt were also hit with a lawsuit alleging they violated federal and state laws by terminating more than 500 workers without the required 60-day advance notice. The case was brought before the US District Court for the Northern District of California.
Deceitful flexibility
Outlier’s «Terms of Service» state that no «promise of work» binds the data workers and the company. Workers only have to do the tasks they agreed to take on, at their convenience, using their own equipment. «It’s up to me to decide when I work, how I work, and for how long,» says one contractor. Workers have full flexibility; the employer has no obligations such as paid vacation, breaks, overtime pay, paid maternity leave, or any of the other rights that unions have fought for over the last 150 years.
Considering this, it does not come as a surprise that none of the data workers we spoke to see their jobs as a career opportunity. «It’s just a desperate way to get some money when you’re unemployed,» says an Iberian-based language specialist for Outlier. The job profile particularly attracts women who are single parents or care for elderly relatives, as they can fit the work around their family obligations. But the flexibility has a downside.
«Flexibility might function as a new type of social compromise that people are willing to make,» says Mira Wallis. «[Contractors] have to deal with all the negative consequences of platform work, in order to have this flexibility. But when you ask them, what this flexibility is actually about in your everyday life, a lot of people are still struggling with combining care work and wage labor on platforms, even though they are working from home, because it doesn't make many things easier. One woman told me all the difficulties of platform work, but then she added: ‘But at the end of the day, I know that I'm free.’»
A lack of sufficient public social provision for single mothers works to the benefit of platform work providers like Outlier. «For some, [IT contract work] is the best of bad alternatives,» says Wallis.
No place to go
If a company such as Outlier operates a platform in the US while workers are dispersed around the world, there is literally no place to go to assemble, organize, or file complaints. «Even Uber drivers or delivery workers meet on the streets,» says Miceli. «Data workers don't know each other, other than maybe on Facebook groups. That is a major problem when it comes to collective bargaining.»
The ones really benefiting from this model are the big tech companies. They must be held responsible for what happens in their supply chain, as they are the final beneficiaries of the data workers’ labor. The BPO company Samasource was hired by Facebook to analyze sensitive content online, including violent imagery, and recruited moderators in Kenya to do the work. These workers have since suffered PTSD and sued the beneficiary of their labor, Facebook’s parent company Meta.
The upcoming EU Corporate Sustainability Due Diligence Directive (CSDDD) requires parent companies to take the necessary measures to prevent serious violations of fundamental social rights in their value chains. This is a legislative tool that exploited workers could leverage. In Germany, all companies with over 1,000 employees must already ensure a range of human rights standards throughout their supply chain. If such a corporation hires a San Francisco-based data annotation or AI support company, the contracted workers’ rights must be ensured.
AI promises prosperity but delivers instability
Acolytes of generative AI consider it an engine of global wage growth and productivity. In his essay «Why AI Will Save the World,» AI guru Marc Andreessen states: «Productivity throughout the economy will accelerate dramatically, driving economic growth, creation of new industries, creation of new jobs, and wage growth, resulting in a new era of heightened material prosperity.»
This theory conflicts with the experience of AI data workers. Contractors are experiencing the worst aspects of hyper-capitalism: wage-dumping, no benefits, breaches of workers’ rights, exploitation, no guaranteed work. This model is perpetuating instability and precarity. «The danger is not that robots will take humans’ jobs, but that humans are having to work for the robots,» Antonio Casilli says.
Meanwhile, John in the UK has now embarked on his PhD on supervillains in literature, which is progressing well.
«I have not done any more work with Outlier,» he tells us.
Michael Bird
Former Fellow Algorithmic Accountability Reporting (2024-2025)

Michael Bird is an award-winning investigative journalist and writer, most recently of Bears Uncovered, a project documenting bear-human conflict in five countries, and of an investigation into fraud in the AI supply chain for AlgorithmWatch's fellowship, alongside Nathan Schepers. His work has appeared in publications such as The Independent on Sunday, Vice, Mediapart, taz, Tagesspiegel, EU Observer and Business Insider, and he has contributed to BBC Radio and Deutsche Welle.
Nathan Schepers
Former Fellow Algorithmic Accountability Reporting (2024-2025)

Nathan Schepers is a software engineer with 20 years of experience in organizations of various types and sizes. During his time as an Algorithmic Accountability Reporting fellow, he worked together with Michael Bird as a researcher, reporter, and technical consultant on the working conditions and other work-related issues identified in the supply chain of AI-training companies.