Digital Economy

The human factor

By Shoshana Zuboff

In legend, Newton watched the apple fall from the tree, but he actually saw something quite different: an invisible force powerful enough to draw the apple to it. Had he been a Silicon Valley engineer or an economist, it’s likely he would have been captivated by the falling object — “Wow, look at this cool apple!” He might have written a treatise on the aerodynamic properties of dense spheres, developed an algorithm to simulate its motion, or calculated the efficiency of its earthbound path. Instead he endured accusations of medievalism and sophistry to assert the theory of universal gravitation, identifying an unseen force intrinsic to every material body and able to exert its power over hundreds of millions of miles with no detectable mechanisms or means of transmission. To consider the electronic condition, it helps to think like Newton — especially when it comes to the economy and our prospects for the future. Like gravity, hidden forces pull on digital technologies and determine how they “fall” into our economies and our jobs. We have to detect and name these forces if we want to challenge and shape them.

A sense of doom and helplessness has planted itself in our public conversation. We watch, like deer caught in the headlights, as economists, technology mavens, and CEOs enthuse deliriously over the new digital capabilities. They say the machines will be able to perform nearly all our jobs, and that the implacable laws of the market favor replacing people with ever-cheaper digital power in the form of robots and algorithms. In this telling, humans are pitted against the machines in a deadly race. Some of these folks even seem puzzled about what kind of contribution humans can make to this robotized future. They are so saturated in the machine world that they can’t recall what humans do or why.

Losers of the new economy

Earlier this month, Google founders Sergey Brin and Larry Page participated in a rare public “chat” with fellow billionaire and venture capitalist Vinod Khosla. Page, explaining Google’s investments in machine learning and robots, anticipated a world in which machines do nearly all the work. With nothing left to do, he suggested that people will be happy to “have more time with their family or to pursue their own interests.” Page’s utopia omits a critical element: most of us are not billionaires. Will the owners of capital really redistribute their profits so that we can all take a permanent vacation from jobs that, in Page’s opinion, are mostly unnecessary anyway? If Page were to abandon his bubble for a day, he might be interested to discover that for the rest of us unemployment does not mean leisure; it means struggle, insecurity, and ever-increasing social inequality. Not even those at the very top of the occupational heap are safe from this new anxiety. As Bill Gates recently put it, “20 years from now, labor demand for lots of skill sets will be substantially lower.”

There’s only one problem with this perspective: it’s a magic trick. In fact, there is nothing inevitable about how digital technologies are used. Like any good trick, the dominant narrative misdirects our attention to the apple of the digital, while the real forces that determine the apple’s path remain hidden from view. What are these hidden forces? They are narrowly conceived business models and economic assumptions that reward cost reduction, especially lower wage bills, above all else. In many situations, their precepts are nothing more than superstitions invoked by the powerful to perpetuate the status quo. There is no one best way for markets or technologies to work. Indeed, there are sound reasons to think that this view of the future is an evolutionary dead end, like a toothed bird in the larger story of capitalism. Instead of the apocalypse, digital technologies can herald a new human turn in economic history.

Let’s briefly zoom in on the language of inevitability and digital determinism, so that we can later zoom out to see what hidden forces might be at play behind these formulations. The power of the misdirect is evident in the headlines and key messages of many articles on my desk. Their authors identify “technology” rather than “capital” or “business” as the driver of automation. For example, a recent study by University of Chicago economists examined 15 years of data in 56 countries and found that the labor share of income declined in all but 9. The authors conclude that their results “support the view that changes in technology, likely associated with the computer and information technology age, are key factors in understanding long-term changes” in the labor share.

Another widely cited study by two Oxford researchers argues that “computers increasingly challenge human labour in a wide range of cognitive tasks.” An article published by the MIT Technology Review is titled “How Technology Is Destroying Jobs.” A book by two MIT professors, The Second Machine Age, predicts a new economy of “winners and losers”: “Technological progress is going to leave behind some people, perhaps even a lot of people, as it races ahead…digital technologies tend to increase the economic payoff to winners, while others become less essential, and hence less well rewarded.”

The “multipurpose tool kit”

A much-quoted piece in The Economist asserts that a new era of automation “enabled by ever more powerful and capable computers” could lead to massive unemployment. “The combination of big data and smart machines will take over some occupations wholesale; in others it will allow firms to do more with fewer workers.”

Another example comes from this year’s World Economic Forum, where Google executive chairman Eric Schmidt organized a “fireside chat” for a gathering of 50 elites to deliver the message that “tech related job destruction was just beginning, inequality was going to get worse, and the solution was for the population to educate themselves into being entrepreneurs so they could survive this new era.” Schmidt warned, “The race is between computers and people and the people need to win…In this fight, it is very important that we find the things that humans are really good at.”

Schmidt’s chat suggests that he may have read the CIA manuals of John Mulholland, the conjurer considered to be “the magician’s magician.” Though less well known than Harry Houdini, Mulholland was revered within the profession for the exquisite precision and intensity of his craft, especially his skill at “sleight of hand,” the ability to deceive up close and personal. “All tricksters…,” Mulholland wrote, “depend to a great extent upon the fact that they are not known to be, or even suspected of being, tricksters…he should be so normal in manner, and his actions so natural, that nothing about him excites suspicion… Trickery depends upon a manner of thinking. It is a lie acted…The objective of the trickster is to deceive the mind rather than the eye…”

Mulholland had been recruited in the early days of the CIA for a top-secret R&D program that produced many tricky inventions, including a .22-caliber single-shot firing device concealed in a tube of toothpaste, a bioinoculator that shot cyanide bullets from a cigarette pack, a hypodermic syringe concealed in a PaperMate ballpoint pen, a depilatory-infused cigar intended to make Castro’s beard fall out, exploding seashells, and my personal favorite: the “CIA Escape and Evasion Rectal Suppository,” a “multipurpose tool kit” that included wire cutters, a pry bar, saw blades, a drill, and a reamer. Ouch!

More capital hardware, more redundant workers

CIA brass soon realized that, as cool as these devices might be, they were as dust in the hands of ham-fisted field operatives, so in 1953 Mulholland was hired to write a highly classified Official CIA Manual of Trickery and Deception and to train field agents. Mulholland emphasized that deception depended upon the ability to confuse in order to mislead. He insisted on the importance of stage-management techniques and the manipulation of sight lines to strategically misdirect attention. With stagecraft and diversion, he pointed out, plausible reasons can be substituted for facts in order to camouflage real intentions and direct the spectator’s attention away from the lie.

Schmidt’s triumph of fireside stagecraft, presumably intended to invoke associations with FDR’s radio “fireside chats,” suggests mastery of the conjurer’s art. What’s hidden behind his cloak? First, recalling Newton, there’s the language that directs attention to “the computer” rather than to the hidden business models, assumptions, and executive choices that decide how computers will be used. Then there is another misdirect: the notion that somehow we have to figure out what “humans are really good at.” The implication is that humans are too messy, dumb, unpredictable, and uncontrollable to have a central role to play in the future. Our talents are a mystery to Schmidt. Finally, there is the call to elites to find something to keep the masses occupied, entertained, and, above all, diverted from the secret at the heart of the trick. Schmidt’s comments provoke an anxiety that works to distract people from outrage. Instead of looking for the lie, we worry about how we will save ourselves and our children from this inevitable wave of displacement and exile.

Recall that the magician’s sleight of hand depends upon the spectator’s close-up focus. What happens to these arguments when we zoom out to see the cyanide pellets and poisoned pens hidden beyond the sight lines of the trick? What secrets hide in this electronic hat? Looking at the bigger picture, an obvious initial question is: who benefits from the supposedly inescapable digital forces poised to take your job?

You’ve probably heard that corporate profits are at a record high, even as the labor share declines and income inequality increases. Yet ever since Henry Ford’s five-dollar day, economic models have assumed that businesses prefer strong demand, even if it means paying higher wages. That assumption lends credence to the notion that technology is the culprit. Low wages, it suggests, couldn’t possibly be the result of executive choice. But economist Paul Krugman wonders if corporations “might not mind a moderate depression.” He posits that each employer is trying to maximize profits by lowering wages or eliminating jobs. Collectively these individual choices add up to more unemployment, as companies invest in capital hardware that can be depreciated rather than in hiring people.

Caricature of companies

For Krugman’s hypothesis to be plausible, it helps to know that the rules of the contemporary business model known as financial capitalism reward CEOs for lowering costs, especially labor costs. Here is one of those invisible forces that pull from beyond our line of sight. It might also help explain a 2010 U.S. Justice Department antitrust investigation that revealed a secret deal between Steve Jobs and Google’s Eric Schmidt to artificially lower employee wages by agreeing not to recruit from each other’s workforce and to share wage information. The illegal deal, characterized by Bloomberg as “incalculable hubris,” eventually included Adobe, Pixar, Intel, and Intuit and reduced workers’ pay by over $9 billion, wealth that instead went to boost the companies’ profits.

Here’s another exploding seashell: CEOs’ pay is often linked to their companies’ stock prices, and analysts mark up the prices of firms that reduce costs and wage bills. This helps explain the Institute for Policy Studies’ finding that in 2009 the U.S. CEOs who “slashed their payrolls the deepest” took home 42% more compensation than that year’s already stratospheric CEO pay average for the S&P 500.

NBC News reported this story and added a qualifier: “it’s worth remembering that a public company’s chief executive has a fiduciary obligation to maximize value for the owners of a corporation—its shareholders.” That may be standard practice today, but it’s a new twist in the history of capitalism. According to Alfred Chandler, the preeminent historian of the business corporation, this new financial capitalism is an exception to the long-established logic of industrial success. In his telling, the supposed inevitability of wholesale labor substitution is the result of a specific and recent history that sharply diverges from earlier practices. Business models that reward short-term moves to lower costs are a hollowed-out caricature of the “deep product knowledge and the continuous development of product-specific capabilities in managers and employees” that once made companies great.

Refugees in our own land

All of this suggests that technology isn’t destroying jobs; people are. Business models and economic assumptions are. Greed plays a role. Automation doesn’t necessarily reduce the significance of human presence and problem-solving skills. Nor does it inevitably reward winners and delete so-called “losers.” Consider the airlines and the consequences of their business model.

The airlines exemplify economic models that rely upon the automation of human labor to reduce costs and produce closed-loop systems. From purchasing tickets to departure and arrival, one no longer engages with “airline staff” but rather with a vast computer system and its inscrutable interface. Travelers negotiate an anonymous rules-based leviathan in which human-to-human contact has been eliminated, except for a few people left to impose social order. The airlines lowered costs by transferring them to the traveler. One must work online to purchase tickets and manage information, contend with substantial and non-negotiable fees incurred for variances from system rules, and endure the stress of the airport experience, where travelers faced with problems, divergence, or uncertainty quite literally have nowhere to turn.

The New York Times recently reported on conditions in the Atlanta airport, where 225,000 daily travelers have no one to help them. So great are their needs that the airport’s chaplains have ventured into this unmapped territory beyond the edge of the business model. They have assumed the customer service function abandoned by the airlines, attending to people in distress and assisting with everything from missed flights to ticket-counter disputes. The employees, those unnecessary “losers” in the airlines’ business model, were actually the people who added the only human value in this otherwise robotized experience.

The Atlanta airport is a concrete example of the kind of world that some CEOs, economists, and stock analysts have in mind for our future. In such a world we are refugees in our own land, marginalized from the activities that shape the quality and effectiveness of our lives. An analogous situation is brewing in the educational sector, where online learning is seen as a way to cut costs and eliminate teachers. Networked technologies will allow us to educate far more people at less cost in every corner of the world, but research shows that it’s a mistake to think this will be done with fewer teachers. A recently published Gallup-Purdue study identified the three university experiences that best predict success in life and work: 1) a teacher who made me excited about learning, 2) a teacher who cared about me as a person, and 3) a mentor who encouraged me to follow my dreams. None of these can be transposed to automated education. There will be many new ways to teach, learn, and configure resources, but all of them will require people: teachers, facilitators, coaches, nurturers, coordinators, visionaries, integrators, supportive communities, and peers. It won’t be the robotic world of winners and losers that the models suggest, but rather a rich human world of many winners.

Beyond understanding

Here is another lie in the trick: machine intelligence does not lower the threshold for human skills; it raises it. Whether it’s programmed financial products or military drones, complex systems increase the need for critical reasoning and strategic oversight by humans. This has been one of the most chilling lessons of the financial crisis. The Final Report of the National Commission on the Causes of the Financial and Economic Crisis in the United States describes the shaky foundations of the subprime mortgage industry: “This entire market depended on finely honed computer models — which turned out to be divorced from reality…When that bubble burst, the complexity bubble also burst: the securities almost no one understood…were the first dominoes to fall…”

Wall Street firms relied on “quants,” the mathematicians who engineered complex financial products and trading algorithms. Managers themselves did not understand the products or their operations and neither did financial regulators, who “increasingly relied on the banks to police their own risks.”

These human failures plunged the world into a nightmare from which most of us have yet to fully recover. When the magician’s cloak was torn away, it revealed a blank stare and a question mark. The Wall Street firms relied on their robotic digital circuits. They enshrined the algorithms and allowed the human system to drift into a disturbing passivity and dependence. Economic value was destroyed, and the human toll was paid in chaos and pain. It is expedient to disregard these facts, but it is decidedly not rational.

Things that humans are really good at

Finally, what’s left behind the magician’s cloak? It is one resounding thought: the same technologies with which they would send us into exile can empower us to blow the old business models out of the water. “All that we’ve gained the machine threatens, as long as it dares to exist as Idea, not obedient tool,” wrote Rilke.

The much-anticipated robot invasion is premised on an economics of contempt that leads to a dead end of exclusion and stagnation. Instead, we can raise a new economics of humanity that is drawn by different force fields. It will disclose new occupations, shape new relationships, and celebrate new forms of participation. Behind the magic cloak, the false dichotomy of winners and losers dissolves. We can’t afford to restrict the activities of information-rich learning and doing to an elite. We can all leverage technology and be augmented by its power. An important new research literature on “neural plasticity” suggests that each one of us is capable of understanding, feeling, and accomplishing far more than the world has ever asked of us or allowed. And yet we continue to imprison our minds and bodies in workplaces, schools, and hospitals whose organizing principles have hardly changed in centuries.
I see a world that has barely scratched the surface of human potential. The digital can help us ignite a new human turn in economic history, as we tackle our most intractable problems in every sector, especially climate, education, and health. Each of these will require people supporting one another in new ways and solving hard problems.


There is no reason other than habit to suppose that market economies can work only one way. In fact, it’s just the opposite. Ours is a broken capitalism that exerts its gravitational pull on the electronic condition. But capitalism’s past successes resulted from its plasticity, its ability to continually adapt to the new needs of new people. There is nothing inevitable or necessary about the current rules of the market game or the politics that it enthrones. This is no utopian thought. On the contrary, it would be unrealistic to think that today’s arrangements cannot and should not be challenged.

Most of us don’t need to “find the things that humans are really good at.” We already know: We are good at being human. In this way we make the world more humane, and we do it better when we have opportunities to learn and contribute. We love, and we do that better when we ourselves are loved and valued. Rilke’s sonnet continues, “But for us existence still can enchant; in a hundred places it’s still Origin.” Let this be our electronic condition. “I sing the body electric,” wrote Whitman, “the armies of those I love engirth me and I engirth them…” Let this be our electronic condition too.

The author

Shoshana Zuboff is the author of The Summons: Our Fight for the Soul of an Information Civilization (forthcoming, 2015). She is the Charles Edward Wilson Professor of Business Administration (retired) at the Harvard Business School and a Faculty Associate at the Berkman Center for Internet and Society at the Harvard Law School.

Source: F.A.Z.