Everyday impact on people around the world whose work and social lives are being affected by machine-learning algorithms, automated decisioning, predictive scoring and other calculative technologies
📚 You can buy books mentioned in this newsletter from my page on Bookshop.org
💭 If you are in London this coming Thursday 6 February, Future Inventions Labs returns with a reading group and guided discussion on Ancestral Technologies at ustwo’s offices in Shoreditch. I went to their first reading group last November and enjoyed it hugely. It is free and starts at 6:30pm. As I write, only a few tickets remain — so book now on Eventbrite and see you there!

AI. Aren’t you tired of coming across these two letters? It is increasingly difficult to avoid breathless claims about what this family of technologies might change — the decimation of white-collar jobs, the mass redundancy of teachers, or (and let’s not overegg this) the almost-quite-but-not-mathematically-certain obliteration of humanity.
Code Dependent: Living in the Shadow of AI (Picador, 2024) is written by Madhumita Murgia — a writer who has specialised in technology and artificial intelligence at the Financial Times for over a decade and who, in 2023, was appointed its first Artificial Intelligence Editor. In Code Dependent, Murgia sets aside the more common hysterical claims of impending doom to illustrate the impact of AI on people’s social lives, public services, and workplaces. Code Dependent takes us across many continents, gathering intimate knowledge of how calculative technologies impose upon the lives of everyday people who “live daily alongside automated systems built on data” (pg. 4).
The scope of Code Dependent is expansive. In its 300 pages, we witness ways in which people’s lives have been affected by large-scale calculative systems in places as far removed as Kibera and Kolkata, Mexico and Mosul, India and IJburg.
We meet a young woman whose self-esteem is shattered by the accidental discovery of a pornographic deepfake of herself that she cannot get deleted from the internet — not a new problem, but one that has gained a renewed sense of urgency now that the world’s biggest popstar has fallen victim to its poisonous effects. We read of the mother of a teenage boy under house arrest in Amsterdam because an algorithmic risk assessment scored him as one of the “Top 600” most likely to reoffend based on a past arrest. We learn of the UberEats courier so suspicious of the opacity of the algorithm setting his wages that he ends up reverse-engineering its code (a rare example of someone successfully fighting back).
Regardless of our vigilance or ambivalence towards calculative technologies that sift, surveil or sort huge amounts of data about us, Code Dependent shows us we can seldom escape the effects these AI-powered systems have on the way we live. A shadow is already being cast on millions of lives — our lives — worldwide.
Digital piece work
If you struggle to grasp the mundanity of how AI impacts people’s livelihoods, agency, dignity or peace of mind — you are not alone. Aside from the difficulties we encounter in trying to understand how this technology works, the more testing challenge we face is making sense of sprawling systems that involve the labour of thousands of people spread around the world, many of whom are unaware of the part they play in keeping the wider ecosystem operational.
“Displaced Syrian doctors train medical software that helps diagnose prostate cancer in Britain. Out-of-work college graduates in recession-hit Venezuela categorize fashion products for e-commerce sites. Impoverished women in Kolkata’s Metiabruz, a poor Muslim neighbourhood, have labelled voice clips for Amazon's Echo speaker. Their work couches a badly kept secret about so-called artificial intelligence systems — that the technology does not ‘learn’ independently, and it needs humans, millions of them, to power it.” — Code Dependent, pg. 22
Code Dependent centres its narrative on everyday workers — people, like you or me, trying their best to get by. People living month to month who need some sort of safe, paid work. People who need to squeeze in a few hours of work here and there to fit around their family or care commitments.
In the opening chapter, we are introduced to Hiba, a refugee fleeing the Iraq war. She fled with her family via Türkiye before settling in Bulgaria, a country whose language she could not speak. Through a refugee assistance charity, she found an opportunity to learn English and do some simple IT work at home.
The data labelling work that Hiba does is repetitive — matching or identifying differences between image samples. Depending on the number of image pairs she gets sent to work on and how fast she can work, she earns around €4 per hour. Sometimes her youngest son gets involved and does some data labelling work too — it’s easy enough for nimble-fingered children; almost like a game. It is not uncommon for children to take on data-labelling work.
The piecemeal work that Hiba performs is broken down to an atomic level (“she has labelled satellite images of fields and oceans and towns, annotated road scenes, tagging pedestrians, traffic lights…”), so much so that she has no context about why this work is needed. You could argue that this is by design: once workers begin to learn which company requested their work, why their work is needed, and what other data labellers in similar situations to their own are required to do, questions of fairness, workplace safety, and ethical management start to come up.
There is an inherent power asymmetry woven into the fabric of digital piecework — data labellers are not in control of how many cents they are offered per task; the demand for their labour changes for reasons unknown to them; the acceptance criteria for their labour are equally fluid.
Hiba’s story is typical of a generation of workers scattered around the world doing ghost work — labour that underpins the functioning of this thing we call AI.
“Even the objective of data-labelling work felt extractive: it trains AI systems, which will eventually replace the very human doing the training.” — Code Dependent, pg. 24
What Murgia does so well throughout Code Dependent is to humanise these individual stories and place them in an ecosystemic context. On one hand, we can see the trajectory that Hiba is on: she is making a modest living in her home, and will perhaps gain an additional qualification to help her progress to work that is more rewarding. On the other hand, we could counter that Hiba’s labour is the very reason AI can function.
Without Hiba’s labour — part of an AI labour supply chain that the big tech companies go to extreme lengths to avoid drawing attention towards — the magic of generative AI, speech recognition, image generation, and wayfinding technologies would probably stop working within days if not hours.

Hiba is one of the lucky ones: thousands more like her spend their days sifting through far more harrowing content.
For data labellers in East Africa, many of whom deal with weeding out the most extreme excesses of sexual, violent, and abusive content — so wealthier consumers are not confronted with unmoderated content on their phones or computer screens — the spur to find fellow workers and organise is treated with greater urgency, if only for the preservation of their mental health. All the while, there is a lingering suspicion that this monotonous work, the endless tagging and matching, is a stopgap until the machines can do this on their own without the involvement of a human.
Is it all bad?
Tuberculosis is a lethal disease. Although largely tamed in the West during the latter half of the twentieth century, rates have recently begun to tick up. In some parts of the world, vigilance towards its detection, isolation and treatment has never dimmed.
While much of the Western tech media froths with hyperbole about the transformative potential of AI, Murgia draws our attention to a trial taking place in Chinchpada, a small village near the western coast of India.
Using a new digital app powered by machine-learning algorithms trained to pick up visual patterns from X-rays, medical practitioners can test for TB in remote and rural locations much more easily. It has never been feasible to bring heavy equipment out to remote villages or to transfer weak, symptomatic patients to cities. By enabling screening with inexpensive equipment, suspected cases — and crucially referrals — can be identified much earlier.
As with many trials of novel technologies, a big hurdle is scaling up and funding a wider rollout. The lure of more lucrative markets for the app’s vendor suggests that the benefits of this technology — “trained” on the patients of Chinchpada — may ultimately be realised elsewhere, at the village’s expense.
Who knows where we are on the hype-scale of AI. If you were after an accurate prediction of what 2025 will bring, then this is the wrong place to find it. What Code Dependent does for us — and many of us would be well served to ponder it — is to pull back and show the vast expanse of lives being transformed by large-scale technologies. AI is an inherently asymmetrical technology — in the way data is extracted and gathered, in the way knowledge is captured by its algorithms, and in the lack of transparency about how its operations rest on the labour of unknown and unnamed data labellers around the world.
The glimpse we get into the lives of people in Code Dependent shows how the agency they — or rather, we — possess is shrinking when grappling with technologically mediated work and leisure. Some, through grit and determination, have managed to claw back some agency from a technological ecosystem that is designed to devalue their labour. We should be reminded that the code we deploy and the digital services upon which we innovate each have far-reaching effects. If the work we do is in the service of one or more of these expansive digital services, we would be well advised to think critically about where the shadows of our work fall and whose light they dim.
Code Dependent: Living in the Shadow of AI by Madhumita Murgia. Published in 2024 by Picador, 320 pages.
More from Madhumita Murgia
Madhumita Murgia is the Artificial Intelligence editor at the Financial Times and has been writing prolifically about technology for over a decade. I have chosen a few of her articles which have had a lasting impact on me, particularly for the insight they give into the everyday effects of the careless deployment of AI.
My identity for sale (WIRED, 2014)
AI’s new workforce: the data-labelling industry spreads globally (Financial Times, 2019)
How one London wine bar helped Brazil to cut crime (Financial Times, 2019)
How actors are losing their voices to AI (Financial Times, 2023)
Algorithms are deciding who gets organ transplants. Are their decisions fair? (Financial Times, 2023)
The race for an AI-powered personal assistant (Financial Times, 2024)
The global AI race: Is China catching up to the US? (Financial Times, 2025)
Explore further
Listen
Transcripts are available for Policed by our data, and Resisting Mental Health Ward Surveillance with Stop Oxevision.
Read
Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass by Mary L. Gray and Siddharth Suri (Harper Business, 2019, 254 pages)
Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence by Kate Crawford (Yale University Press, 2021, 288 pages)
Family Units by Julian Posada (LOGIC, 2021)
Biometric Britain: The Expansion of Facial Recognition Surveillance by Silkie Carlo, Jake Hurfurt and Madeleine Stone (Big Brother Watch, 2023)
How AI reduces the world to stereotypes by Victoria Turk (Rest of World, 2023)
Inside the Suspicion Machine by Eva Constantaras, Gabriel Geiger, Justin-Casimir Braun, Dhruv Mehrotra, and Htet Aung (WIRED, 2023)
More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech by Meredith Broussard (MIT Press, 2023, 234 pages)
Under The Calculative Gaze by Sanela Jahić with Dan McQuillan (Aksioma, 2023, 215 pages)
We tested ChatGPT in Bengali, Kurdish, and Tamil. It failed. by Andrew Deck (Rest of World, 2023)
Everyone Is Judging AI by These Tests. But Experts Say They’re Close to Meaningless by Jon Keegan (The Markup, 2024)
Facial recognition cameras in supermarkets ‘targeted at poor areas’ in England by Shanti Das (The Guardian, 2024)
The Good Robot: Why Technology Needs Feminism edited by Eleanor Drage and Kerry McInerney (Bloomsbury, 2024, 272 pages)
The Limits of Data by C. Thi Nguyen (Issues in Science and Technology, 2024)
Long days, longer battles by Callum Cant (Vittles, 2024)
Tech bros need to realise deepfake porn ruins lives – and the law has to catch up by Luba Kassova (The Guardian, 2024)
UK banks prepare for deepfake fraud wave by Akila Quinio (Financial Times, 2024)
The young people sifting through the internet’s worst horrors by David Pilling (Financial Times, 2024)
‘It’s a nightmare’: couriers mystified by the algorithms that control their jobs by Robert Booth (The Guardian, 2025)