The Golem's Hands: Automation, Labor, and Who Does the Work
The Maharal didn't create the golem for companionship or curiosity. He created it to work. The Jewish community of Prague faced threats it couldn't handle alone, and the golem was built to do what humans couldn't: patrol without rest, protect without fear, labor without complaint. It was, in the most literal sense, a tool for getting things done.
This is the part of the golem story that resonates most directly with the history of technology. Every major wave of automation, from the power loom to the assembly line to the software bot, has been a version of the same act: creating something tireless to do work that humans find difficult, dangerous, or tedious. The golem's hands are the original automation story.
But the golem story doesn't end with the work getting done. It asks who benefits, who gets displaced, and what happens when the tireless servant changes the nature of work itself.
The Loom and the Luddite
No movement in the history of technology is more misunderstood than the Luddites. The common telling reduces them to technophobes who smashed machines because they feared progress. The actual history is more nuanced and more relevant.
The Luddites were skilled textile workers in early 19th-century England who destroyed weaving machinery between 1811 and 1816. They weren't opposed to technology as such. They were opposed to the specific way factory owners were deploying technology: to replace skilled workers with cheaper, less skilled labor and to concentrate the economic gains of mechanization among owners rather than workers.[1]
The Luddites understood something that the golem story also teaches: the question isn't whether the golem can do the work. It's who the golem is working for. A loom that makes cloth faster is a neutral tool. A loom deployed specifically to eliminate skilled jobs and suppress wages is a choice about how the benefits of automation are distributed.
This distinction matters because it reframes the automation debate. The question "will AI take my job?" is a golem question, but it's the wrong golem question. The better question is: "When AI does this work, who captures the value?"
The Productivity Paradox
Economists have long observed a puzzle: automation consistently increases productivity, but the gains don't consistently flow to workers. Erik Brynjolfsson and Andrew McAfee documented this divergence extensively, showing that from roughly 1973 onward, productivity in the United States continued to rise while median wages stagnated.[2]
The golem works harder every year. The community it was built to serve doesn't proportionally benefit.
This isn't inevitable. In the decades following World War II, productivity gains and wage gains tracked each other closely. Workers shared in the value their tools created. The divergence that began in the 1970s coincided with several factors: declining union membership, policy changes favoring capital over labor, and the beginning of the computer revolution that made it possible to automate tasks previously requiring human judgment.[3]
The golem doesn't decide who benefits from its labor. That's a human choice, made through wages, policies, ownership structures, and institutional design. The Maharal directed his golem to protect the community. Modern automation is often directed by different incentives.
The Transfer of Labor
One of the subtler effects of automation is that it often doesn't eliminate labor so much as transfer it. The work doesn't disappear. It moves.
Self-checkout machines in grocery stores are the clearest example. The scanning, bagging, and payment processing that a cashier used to perform are now done by the customer. The labor wasn't automated. It was transferred from a paid employee to an unpaid customer. The store's costs went down. The customer's effort went up. The golem, in this case, is the customer, doing work they didn't use to do.[4]
The pattern appears across industries. Automated phone trees transfer the work of routing calls from a receptionist to the caller. Online booking systems transfer the work of scheduling from an agent to the user. Self-service kiosks at airports transfer check-in labor from airline staff to passengers. In each case, the company reports efficiency gains. The labor didn't vanish. It was redistributed.
AI-powered tools are creating a new version of this transfer. When a company deploys a chatbot for customer service, the straightforward queries get handled automatically. The complex, frustrating, emotionally charged queries still reach human agents, but now those agents handle a concentrated stream of difficult cases with fewer easy ones to balance the load. The golem took the simple work. The humans got what's left.
The Gig Economy as Golem Management
Gig economy platforms represent a particularly interesting golem dynamic. The platform itself is the golem: an automated system that assigns work, monitors performance, sets prices, and terminates workers, all without human judgment in the loop.
Drivers, delivery workers, and freelancers on these platforms often describe the experience of working for an algorithm. The algorithm decides which jobs to offer, calculates pay, tracks location and speed, and can deactivate a worker's account based on metrics the worker may not fully understand. The management layer has been automated, and the automated manager has the golem's characteristic: it follows its optimization function without understanding the human context of its decisions.[5]
A human manager might notice that a delivery driver is having a bad day, or that a particular route is unusually difficult, or that a customer's complaint is unreasonable. The algorithmic manager processes metrics. It doesn't notice anything. It optimizes.
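The gap between metric processing and human noticing can be made concrete with a toy sketch. All names, thresholds, and rules here are hypothetical; real platforms' deactivation logic is proprietary and opaque, which is part of the point:

```python
# Toy sketch of metric-only "algorithmic management" (hypothetical names
# and thresholds; not any real platform's logic).
from dataclasses import dataclass


@dataclass
class Worker:
    name: str
    rating: float           # average customer rating, 1.0-5.0
    completion_rate: float  # fraction of offered jobs completed


def deactivation_decision(w: Worker) -> bool:
    """Decide purely on metrics. A bad day, an unusually hard route,
    and an unfair complaint all look identical: a low number."""
    return w.rating < 4.6 or w.completion_rate < 0.85


drivers = [
    Worker("A", rating=4.9, completion_rate=0.95),
    Worker("B", rating=4.5, completion_rate=0.97),  # one rough week of ratings
]
deactivated = [w.name for w in drivers if deactivation_decision(w)]
# Worker B is cut with no appeal and no context -- the function has no
# input through which context could even enter.
```

Nothing in `deactivation_decision` is malicious; it simply has no channel for the information a human manager would weigh, which is exactly the golem's emptiness rendered as an interface.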
This creates a power asymmetry that the golem story anticipated. The golem is stronger than any individual human. The algorithmic platform has more information, more leverage, and more persistence than any individual worker. The Maharal could direct his golem because he created it and understood it. Workers on algorithmic platforms often have limited visibility into the system that governs their livelihood.
The Creative Displacement
The most recent wave of automation has reached domains previously considered safe from the golem's hands: creative work.
Large language models generate text. Image generators produce visual art. Music generation tools compose scores. Code assistants write software. The 2023 strikes by the Writers Guild of America and SAG-AFTRA were explicitly about AI's role in creative industries, with writers and actors seeking protections against being replaced by, or having their work used to train, automated systems.[6]
Creative displacement raises a question the Maharal didn't face: what happens when the golem can do work that was supposed to be uniquely human? The original golem couldn't speak, couldn't reason, couldn't create. It was powerful but empty. Modern AI systems produce outputs that look creative, even if the process behind them is statistical pattern matching rather than genuine understanding.
The distinction between "produces creative-looking output" and "is creative" matters, but it may not matter to the market. If a client can get a serviceable logo, article, or code snippet from an AI tool at a fraction of the cost of a human, the economic pressure is real regardless of whether the AI "understands" what it's producing. The golem doesn't need to understand art to displace artists. It just needs to produce something close enough, cheap enough, fast enough.
Who the Golem Serves
The golem story always returns to the same question: who is the golem serving?
The Maharal built his golem to serve the community. The community's safety was the purpose, and the golem was the means. When the golem's behavior drifted from that purpose, the Maharal shut it down. The purpose came first. The tool served the purpose.
Modern automation often inverts this relationship. The tool is built, and then a purpose is found for it. Or the purpose is defined narrowly (reduce costs, increase throughput, improve margins) in ways that serve some stakeholders at the expense of others. The golem works for whoever inscribed the instructions on its forehead, and in a corporate context, those instructions typically optimize for shareholder value.
This isn't a condemnation of automation. The golem's hands have done enormous good: reducing dangerous labor, increasing access to goods and services, enabling productivity that supports higher living standards. The question is whether we're directing the golem with the Maharal's intentionality, building it for a purpose that serves the community, or whether we're building golems first and asking who they serve later.
The Maharal knew the answer before the clay was shaped. That's the part of the story worth remembering.
References
[1] Kevin Binfield (ed.), Writings of the Luddites, Johns Hopkins University Press, 2004. https://www.goodreads.com/book/show/2763322-writings-of-the-luddites
[2] Erik Brynjolfsson and Andrew McAfee, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, W.W. Norton & Company, 2014. https://wwnorton.com/books/The-Second-Machine-Age/
[3] Lawrence Mishel and Jessica Schieder, "CEO compensation surged in 2017," Economic Policy Institute, August 16, 2018. https://www.epi.org/publication/ceo-compensation-surged-in-2017/
[4] Craig Lambert, Shadow Work: The Unpaid, Unseen Jobs That Fill Your Day, Counterpoint Press, 2015. https://www.counterpointpress.com/dd-product/shadow-work/
[5] Alex Rosenblat, Uberland: How Algorithms Are Rewriting the Rules of Work, University of California Press, 2018. https://www.ucpress.edu/books/uberland/paper
[6] Mark Muro and Yang You, "Hollywood writers went on strike to protect their livelihoods from generative AI. Their remarkable victory matters for all workers," Brookings Institution, April 2024. https://www.brookings.edu/articles/hollywood-writers-went-on-strike-to-protect-their-livelihoods-from-generative-ai-their-remarkable-victory-matters-for-all-workers/