The Frustration with Productivity Culture

Why we’re so tired of optimizing our work lives, and what we should do about it.
Among knowledge workers, there’s a growing distaste for the language of productivity and optimization.

Early in the pandemic, I received an e-mail from a reader who embraced my writing about the importance of deep work and the need to minimize distractions, but was thrown by my use of the term “productivity” to describe these efforts: “The productivity language is an impediment for me.” Intrigued, I posted a short essay on my Web site that reacted to her message, proposing that the term “productive” could be salvaged if we define it more carefully. There were, I wrote, positive aspects to the idea of productivity. For example, by better organizing administrative tasks that cannot be ignored—paying taxes, filing forms—you can reduce how much time you spend on such drudgery. On a larger scale, the structured “productive” pursuit of important projects, far from being soulless, can be an important source of meaning.

My readers didn’t buy my defense. The comments were filled with a growing distaste for the many implications and exhortations that had become associated with productivity culture. “The productivity terminology encodes not only getting things done, but doing them at all costs,” one reader wrote. Another commenter pushed back against the proliferation of early-pandemic business articles that encouraged workers to stay “productive” even as they were thrown unexpectedly into remote environments: “The true message behind these posts is clear: ignore your growing sense of existential dread, ignore your children, and produce value for our shareholders—or else!” Others advocated for alternative terms, such as “alive time,” or “productive creativity”—anything to cleave the relationship between “productivity” the signifier and all that it had come to signify.

Some of these reactions were amplified because of the unique stresses of the early pandemic, but that alone cannot explain their stridency. A growing portion of my audience was clearly fed up with “productivity,” and they are not alone. The past few years have seen many popular books that elaborate this same point. In 2019, the artist and writer Jenny Odell helped start this trend when she published “How to Do Nothing: Resisting the Attention Economy,” which became a Times best-seller and was selected by Barack Obama as one of his favorite books of the year. This was followed, the next spring, by Celeste Headlee’s “Do Nothing: How to Break Away from Overworking, Overdoing, and Underliving,” then Anne Helen Petersen’s “Can’t Even: How Millennials Became the Burnout Generation,” and, earlier this year, Devon Price’s “Laziness Does Not Exist.” Though these books ultimately present a diverse collection of arguments, they are unified by a defiant rebuke of productivity culture.

A striking element of these books is the degree to which their message is born out of personal experience. Not long after Headlee’s book was published, I interviewed her and asked why she decided to write about this topic. She told me about a TED talk she had given on how to have better conversations, which unexpectedly went viral, gathering more than twenty-five million views. “I was inundated with requests for writing and speaking,” she said. She tried to say “no” more often, but found that “the offers got harder and harder to turn down.” She was soon overwhelmed. “I was more stressed out, and more busy, and sick,” she said, describing two prolonged illnesses that laid her low during this period. “That’s what made me realize I was in crisis: I rarely get sick.” Headlee concluded that humans are not wired to maximize activity—she argued that we’re pushed into this unnatural and unhealthy state by cultural influences that aren’t aligned with our best interests, citing “a combination of capitalist propaganda with religious propaganda that makes us feel guilty if we’re not feeling productive.”

It’s understandable that authors such as Headlee, or the commenters on my essay, have become frustrated with the lionization of “productivity”: we’re exhausted and are fed up with the forces that pushed us into this state. But, before we decide whether we need to dispense with the term altogether, we should briefly revisit its history. The use of the word “productive” in an economic context dates back to at least the time of Adam Smith, who used it in “The Wealth of Nations” to describe labor that added value to materials. According to Smith, a carpenter transforming a pile of boards into a cabinet is engaging in productive labor, as the cabinet is worth more than what the original boards cost. As the formal study of economics solidified, “productivity” gained a more precise formulation: output produced per unit of input. From a macroeconomic perspective, this metric is important, because increasing it produces surplus value, which in turn grows the economy and generally improves the standard of living. On long timescales, improvements in productivity can be enormously beneficial. Writing in 1999, the management theorist Peter Drucker noted that the productivity of the manual worker had grown fiftyfold over the course of the twentieth century. “On this achievement rest all of the economic and social gains of the 20th century,” Drucker concluded. In other words, the increase in productivity is why most Americans today own a smartphone, whereas a century ago many of them lacked indoor plumbing.

If you accept that increased productivity helps the common good, the question becomes how to reliably achieve these increases. Until recently, the answer largely involved optimizing systems. In the eighteenth century, agricultural productivity was increased by the introduction of the Norfolk four-course system, which avoided the need to leave fields periodically fallow. Similarly, the productivity of early-twentieth-century car manufacturing leaped forward when the craft method (in which workers moved around a stationary chassis) was replaced by Henry Ford’s continuous-motion assembly line (in which the chassis moved past the stationary workers). The relationship between these optimized systems and the people who toiled in them was complicated and often quite dark. The introduction of the industrial assembly line, for example, accelerated the de-skilling of manual labor and made workers’ tasks more monotonous. Most relevant to this discussion, however, is that these optimizations were devised largely apart from the individual employees who worked within the systems. If you worked on a Ford assembly line, you didn’t need to read about the habits of highly effective people to do your job well.

Then came the rise of knowledge work. By the time this term was first introduced, in 1959, the center of gravity for the American economy had begun moving from fields and factories toward offices, and many of these office-based efforts evolved from rote clerical tasks to more creative and skilled initiatives. The importance of increasing macro-level productivity remained, but the way we pursued these increases changed. Instead of continuing to focus on optimizing systems, the knowledge sector, for various complicated reasons, began to shift onto the individual worker the burden of improving output produced per unit of input. Productivity, for the first time in modern economic history, became personal.

We should not underestimate the radical nature of this shift. Historically, optimizing systems to increase productivity was exceedingly difficult. The assembly line didn’t arrive in a flash of self-evident insight. Ford suffered through numerous false starts and incremental experiments. He had to invest significant amounts of money and develop new tools, including one particularly ingenious mechanism, which could simultaneously drill forty-five holes into an engine block. Now we casually ask individual knowledge workers to undertake similarly complex optimizations of their own proverbial factories, and to do it concurrently with actually executing all the work they’re attempting to streamline. Even more troubling is the psychological impact of individualizing these improvements. In classic productivity, there’s no upper limit to the amount of output you seek to produce: more is always better. When you ask individuals to optimize productivity, this more-is-more reality pits the professional part of their life against the personal. More output is possible if you’re willing to steal hours from other parts of your day—from family dinners, or relaxing bike rides—so the imperative to optimize devolves into a game of internal brinkmanship. This is an impossibly daunting and fraught request, and yet we pretend that it’s natural and straightforward. It’s hard enough to optimize a factory, and a factory doesn’t have to worry about getting home in time for school pickups.

This brings us back to the original question of whether the term “productivity” has outgrown its utility. I don’t think we can abandon the word altogether. The precise economic property that it names is important: we need to measure it, and we need to keep seeking to increase it. This proposition probably already puts me at odds with the recent anti-productivity movement, which often calls for resistance to the capitalist imperative toward growth—a stance in which macroeconomic productivity is downgraded in importance. I think this goes too far, because at a large scale stagnant productivity is more likely to be recessionary than utopian. The problem is not productivity, per se, but the manner in which we seek to increase it. I’m convinced that the solution to the justified exhaustion felt by so many in modern knowledge work can be found, in part, by relocating the obligation to optimize production away from the individual and back toward systems.

We’ve seen shifts of this type occur in isolated pockets. As Fred Brooks documented in his 1975 project-management classic, “The Mythical Man-Month,” by the nineteen-seventies software projects had become too large and complicated to be effectively organized by existing approaches. After a point, he famously wrote, throwing more people at these projects didn’t speed up their completion. To gain more efficiency, better systems were needed. Brooks’s ideas eventually gave rise to the revolution in so-called agile methodologies for software development—such as scrum and kanban—which helped overcome many of the inefficiencies that Brooks had identified. The software industry didn’t increase productivity by demanding more from its engineers; instead, it developed a more productive system to organize their efforts.

This should be the model for increasing productivity throughout knowledge work. Instead of demanding that employees individually produce more, we should seek systems that produce more with the same number of employees. This shift might seem subtle, but its impact can be enormous: it frees individuals from the complexity of optimizing output all on their own, and it defuses the psychological torment of pitting the personal against the professional. We should avoid, of course, an overly optimistic view of a systems approach. When others design the rules under which you work, you cannot trust that those rules will be ones you like. But, as the history of labor politics in other economic sectors teaches us, the advantage of isolating these decisions in clearly defined systems is that they give workers a target against which to push back. If the response to the inefficiencies pointed out by Fred Brooks had been a system in which, say, software developers were given a mild electric shock whenever their attention wandered, the developers would have revolted. The industry instead embraced agile methodologies that were more efficient but also generally preferred by the programmers who deployed them. (Even Henry Ford had to introduce higher wages to make the monotony of his assembly line tolerable to his workers.) These types of management-labor negotiations become impossible when productivity is individualized. If it’s up to you alone to get more done, then attempts to moderate your workload can be misinterpreted as laziness.

To discuss these workplace issues effectively, we must be careful to narrow the definition of “productivity” to its precise meaning: the maximization of output. It is exactly these optimization efforts that I suggest moving away from the individual and back toward systems. Critically, this specificity leaves intact the broader expectation that knowledge workers do their jobs well. Much of the professional self-improvement literature that is often hastily summarized as being about “productivity” really is not, in the sense that it focusses less on increasing your output above all else and more on nuanced goals, such as reducing stress through better organization, making smarter decisions about your time, being a good leader, or producing higher-quality results. These are largely reasonable objectives for individuals to pursue—though, admittedly, the line between unreasonable productivity optimization and reasonable self-improvement can be hazy. We see this distinction in our software case study. Agile project-management methodologies didn’t eliminate the need for programmers to strive to be better coders, but they did spare developers from having to worry excessively about what they should be coding and whether they had done enough. Leaving individuals to focus on executing their work well, while letting scrutinized systems handle the allocation and organization of this work, might be exactly the balance needed to allow growth without dehumanization.

Pulling together these threads, it seems that our problem in this moment of overload is not our general interest in productivity but the specific targets to which we apply this objective. I’m still happy to use “productivity” when talking about a sector, or a company, or a system, but I’m increasingly sympathetic to the resistance, among my readers and among critics such as Celeste Headlee, to applying the term to people. We should strive to be good at our jobs—to work deeply, to be reliable, to lead with vision. But, if our employers need more output for each unit of input they employ, we should be more comfortable replying that, although we understand their predicament, solving it is not really our problem.

