We shall see how each of these scientific buzzwords is involved when being idle and when being creative, and why being idle might make you more creative. Each of the areas mentioned above is an active field of cutting-edge research involving thousands of scientists, and there are several excellent references for further reading at the end of this book. Each topic covers whole semesters of graduate courses, and for some scientists these areas are their life’s work. Yet scientists still understand very little about how the brain works. What’s more, applying these ideas to the study of the brain is a fairly recent development in psychology and neuroscience. So if you can grasp some of these ideas and how they relate to the brain, consider yourself a person with a scientific edge.
Allowing the brain to rest opens the system to exploiting these mechanisms of nonlinearity and randomness, and amplifies the brain’s natural tendency to combine percepts and memories into new concepts. Anecdotal evidence from writers and artists, as well as recent psychological studies, suggests that in order to really tap the creative potential of the brain, a complex nonlinear system, we should allow ourselves long, uninterrupted periods of idleness. At a minimum, resting may be as important for brain health as directed mental activity, if not more so.
1
THAT LOATHSOME MONSTER IDLENESS1
“Be diligent in your callings, and spend no time in idleness, and perform your labours with holy minds, to the glory of God, and in obedience to his commands.”
—from Richard Baxter’s A Christian Directory
At least since Homer we’ve been ambivalent on the subject. In the Odyssey, the Lotus-eaters lolled around all day “munching lotus” and were both hospitable and seemingly quite content. But they were a threat to Odysseus and his crew. When he arrived at the land of the Lotus-eaters, the workaholic captain sent a couple of his men to investigate the locals. The Lotus-eaters “did them no hurt” but instead offered Odysseus’s men some of their brew, which was so overpowering that the Greeks gave up all thought of returning home. Odysseus, the personification of the heroic CEO, forced the affected men back to the ship and then tied them to the ship’s benches. He recognized that if the rest of the crew got a taste of the drug, they would never leave the island, and ordered the ship to cast off. In Samuel Butler’s translation, “they took their places and smote the grey sea with their oars.”
Despite the Western cliché of China as a country where work, productivity, and industry are enshrined as the greatest of ideals, during Confucian times idleness wasn’t a subculture but an integral part of the culture. A Confucian gentleman grew long fingernails to prove that he did not have to work with his hands. Confucianism actually disdained hard work and instead idealized leisure and effortlessness. According to Lawrence E. Harrison, a senior research fellow at Tufts, “for the Chinese, Sisyphus is not a tragedy but a hilarious joke.” Harrison writes that the highest philosophical principle of Taoism is wu-wei, or non-effort, which means that a truly enlightened person, whether in spiritual or intellectual pursuits, goes about life with the minimum expenditure of energy. In military matters, the ancient Chinese held that a good general forces the enemy to exhaust himself and waits for the right opportunity to attack, using the circumstances to his advantage while doing as little as possible. This is in contrast to the Western idea of trying to achieve some predefined objective with overwhelming effort and force. It is thus paradoxical that in spite of China’s long history of embracing idleness, it’s currently thought of as the world’s factory. This might be because, as a Chinese physicist told me recently, China has only “overcome” Confucianism in the last half century or so.
With the coming of the Enlightenment in the West, as work became mechanized, bureaucratized, and dehumanized, philosophers fought back. At that point, as the capitalist world system started an unprecedented period of expansion, Western culture popularized the concept of the “noble savage,” one of whose particular attributes was lounging around and eating the fruit that supposedly fell into his lap. The incomparable Samuel Johnson published a series of essays on the benefits of being idle in the periodical The Idler from 1758 to 1760. He wrote that “Idleness … may be enjoyed without injury to others; and is therefore not watched like Fraud, which endangers property, or like Pride, which naturally seeks its gratifications in another’s inferiority. Idleness is a silent and peaceful quality, that neither raises envy by ostentation, nor hatred by opposition; and therefore no body is busy to censure or detect it.”
But the capitalists could not be stopped. The 19th century saw the advent of the global industrial economy. As human beings came to function like cogs in the complex machine called the factory, Frederick Taylor, godfather of the efficient American work ethic, introduced “scientific management” to capitalist overseers in The Principles of Scientific Management. His goal was to integrate the life of the worker with the life of business by means of what was then considered a scientific understanding of human beings. Taylor sought to increase production efficiency by minutely measuring the time and motion of tasks. Anticipating modern productivity fads like Six Sigma (which we will get to in Chapter 8), Taylor looked to replace each tradesman’s knowledge and experience with a standardized and “scientific” technique for doing work. While Taylorism was and still is hugely popular among the business class, humanists of all stripes were unenthusiastic. In 1920, perhaps in reaction to increasing Taylorization, the concept of the robot—a fully mechanized, soulless worker, physically as well as spiritually dehumanized—was introduced by Czech playwright Karel Čapek. The very word “robot” derives from the Czech robota, meaning drudgery or forced labor. The same year, American humorist Christopher Morley published his now-classic essay On Laziness. “The man who is really, thoroughly, and philosophically slothful,” he wrote, “is the only thoroughly happy man. It is the happy man who benefits the world. The conclusion is inescapable.”
Wrote Lenin of Taylor’s philosophy: “… the famous Taylor system, which is so widespread in America, is famous precisely because it is the last word in reckless capitalist exploitation. One can understand why this system met with such an intense hatred and protest on the part of the workers.” Despite seeing Taylorism for what it was, a new technology of exploitation, Lenin adopted many of Taylor’s techniques in organizing Soviet factories.
With the advent of the 1980s and Ronald Reagan, the mantra that productivity was essential to self-esteem took hold. It was good for America, it was good for business. Laziness, on the other hand, was anti-American: as recently as 2012, the State of South Carolina used laziness as a rationale for imposing voter ID laws.2 The lazy don’t deserve to vote. In 1985, Ken Blanchard and Spencer Johnson famously tried to convince employees that “people who feel good about themselves produce good results” in the iconic The One Minute Manager. A watered-down spiritual parable with efficient capitalism instead of inner enlightenment as its goal, the book is about a bright young man who wants to become an effective manager. He travels the world and meets different types of managers, among them the oppressive “autocrat” and the pleasant “democrat,” and comes to the insight that an effective manager cares about both people and results. Eventually, the young man meets the One Minute Manager, who enlightens him with the news that managerial nirvana can be reached using three simple techniques: One Minute Goals, One Minute Praisings, and One Minute Reprimands. In time, the young man himself becomes an ordained One Minute Manager. He becomes beloved and rich.
Still, idleness persists, despite the rise of “scientific management.” Like the Lotus-eaters, modern thinkers remind us to taste the sweet fruit of leisure and take it easy. Tom Hodgkinson, editor of the annual magazine The Idler, wrote the UK best seller How to Be Idle: A Loafer’s Manifesto. Tom Lutz chronicles the history of idleness in his fantastic Doing Nothing: A History of Loafers, Loungers, Slackers, and Bums in America. For my generation, the movie Slacker by Richard Linklater inspired some of us to drop out of college. A strong case for kicking back can be found in Véronique Vienne’s The Art of Doing Nothing. Even Bertrand Russell, one of the most prolific mathematicians and philosophers of the 20th century, wrote a book called In Praise of Idleness. In it, he writes, “I want to say, in all seriousness, that a great deal of harm is being done in the modern world by the belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work.”
These books and many others do a remarkably good job (given that they are often written by self-identified lazy people) of espousing the positive aspects and importance of being idle. Some propose using idleness as just another means to success, others propose being idle simply for the sake of being idle, and still others suggest using idleness as a political tool to fight the capitalist system. While I wholeheartedly endorse any reason for being idle, in this book I take the argument one step further by presenting some recent and surprising neuroscience, which shows just what your brain is doing while you are doing nothing. I make the argument (which may be contested by managers and neuroscientists alike) that doing nothing—really and truly nothing—actually makes your brain function better.
According to legend, it was while lazing in bed and staring at a fly on the ceiling that Descartes, habitually a late riser, conceived of the “X” and “Y” axes that comprise the coordinate grid, now the bane of so many grade-schoolers who lose sleep studying its properties. The greatest breakthroughs in science and the greatest works of art—in short, many of the greatest ideas in history—may not be the result of arduous, persistent labor. Rather, sudden flashes of insight or “Aha!” moments often come during what Rilke eloquently described as the “last reverberations of a vast movement that occurs within us during idle days.” It turns out that there may be a neuroscientific explanation for this.
Rilke could not have known how spot-on his metaphorical use of the word “reverberations” would become almost a century later in modern neuroscience. As we will see, assemblies of neurons in our brains literally reverberate even while we are doing nothing. In fact, some groups of neurons in the “hubs” of our brain’s many networks reverberate more intensely while we are at rest. This is a recent discovery, and one that to my knowledge has not reached a popular audience. This book is about taking the idea of reverberation seriously—and using neuroscience as the ultimate excuse for taking it easy. One of the great paradoxes of modern life is that technology, for all its advantages, is actually taking away our leisure time. We are now wired 24/7. Idleness has become an anachronism.
The “resting-state network” (RSN) or “default-mode network” (DMN), as it is called, was discovered by neuroscientist Marcus Raichle of Washington University in St. Louis in 2001. This network comes alive when we are not doing anything. Raichle noticed that when his subjects were lying in an MRI scanner and doing the demanding cognitive tasks of his experiments, there were brain areas whose activity actually decreased. This was surprising, because it had previously been assumed that during cognitive tasks brain activity should only increase, relative to another task or to a “flat baseline.” This led Raichle to study what the brain was doing in between his experimental tasks. What he discovered was a specific network whose activity increased when subjects seemed to disengage from the outside world. When you have to perform some tedious task in an fMRI (functional magnetic resonance imaging) experiment, such as memorizing a list of words, certain areas of your brain become more active and other areas become less active. This does not seem peculiar. However, if you are just lying in the scanner with your eyes closed or staring up at the screen, brain activity does not decrease; the activity merely switches places. The areas that deactivate during tasks become more active during rest. This is the resting-state network. Since then, hundreds of papers have been published examining brain activity during rest, and the discovery of the default mode network has generated a great deal of excitement and controversy.
Many areas of the brain are specialized for certain functions. For example, the visual cortex processes early visual information and the amygdala generates warnings and helps us decide to fight or flee. The resting-state network is specialized for times when your brain doesn’t have to worry about running from a mugger or checking your iPhone. In the absence of anything in particular to do, the resting-state network lights up and starts talking to itself (i.e., you). This network has a coherent structure in the brain, and there is little variation from person to person. The resting-state network is involved in mind-wandering or daydreaming. It becomes active when you are lying in the grass on a sunny afternoon, when you close your eyes, or when you stare out the window at work (if you are lucky enough to have a window at work). Perhaps most interestingly, those elusive “Aha!” moments may occur more often in people who allow their brain’s resting-state networks time to reverberate.
The idea of a resting-state network is a difficult thing for many experimental psychologists and neuroscientists to accept, because a foundational assumption in cognitive neuroscience is that unless you stimulate the brain with an external signal, any detectable brain activity is just noise. How can there be a coherent brain network dedicated to doing nothing? There is an ongoing controversy within psychology and neuroscience about the significance of the default mode network. The brain is viewed by some psychologists as primarily reflexive, driven only by the momentary demands of the environment.
Thus, some scientists believe that studying the brain at rest is a waste of time. Even more extreme is the assumption that brain responses to external events emerge from a so-called “flat baseline.” In other words, what your brain is doing while you are doing nothing could not possibly be interesting from a scientific perspective—if you’re doing nothing, your brain is doing nothing. There are many reasons why these assumptions were and still are hard to dislodge; one of the most powerful is that it is convenient to assume that everything that happens outside carefully-controlled experimental events is noise that the scientist can safely ignore. Another reason may be that most psychologists and neuroscientists tend to be resistant to ideas about brain function that originate from sources outside their fields. The default mode network fits snugly into what is called complexity theory, which we will return to in Chapter 5.
But it turns out that the brain is not just sitting there waiting for the next stimulation. Rather, the brain is perpetually and spontaneously active. It is maintaining, interpreting, responding, and predicting. In fact, the brain uses more energy for spontaneous, intrinsic activity than for accomplishing specific tasks such as multiplying eight by seven, or filling in the cells of a spreadsheet. According to renowned neuroscientist György Buzsáki, professor at the Rutgers Center for Molecular and Behavioral Neuroscience, most of the brain’s activity is generated from within. External inputs usually cause only minor perturbations from the brain’s internally controlled program. Make no mistake: external perturbations are critical in order for the brain to develop normally. No brain can develop in isolation; the brain needs to be “calibrated” to the external world through experience. Nevertheless, the brain as a complex system keeps itself in balance through self-generated patterns. As I mentioned, the concepts behind these insights into brain function come from fields outside psychology and neuroscience, such as complex systems science and physics. We are just beginning to understand what the brain’s spontaneous activity really means. We explore the resting brain and its role in creativity in more detail in Chapter 2 and Chapter 6.
What emerges, though, is the idea that perceptions, memories, associations and thoughts may need a resting mind in order to make their way through our brain and form new connections. Eastern traditions have been aware of this through meditative practices for thousands of years. In Buddhism, monks train to calm their minds. Western society has instilled in us a belief that every moment of every day must be filled with activity. Indeed, it is almost a moral obligation in the US to be as busy as possible. I will try to show that for certain things the brain likes to do (for example, coming up with creative “outside of the box” solutions) you may need to be doing very little.
When your brain is bombarded with stimuli like emails, phone calls, text messages, Facebook updates, errands, driving around, talking to your boss, checking your to-do list, etc., it is kept busy responding to what neuroscientist Scott Makeig, director of the Swartz Center for Computational Neuroscience in La Jolla, California, calls “the challenge of the moment.” Clearly, it is very important to be able to respond to the moment. Sometimes our survival depends on the ability to successfully meet this challenge. However, if that moment becomes every minute of every day of every month of every year, your brain has no time left over to make novel connections between seemingly unrelated things, find patterns, and have new ideas. In other words, to be creative.
Thinkers such as Bertrand Russell, Rilke, and Oscar Wilde may have been tapping into something that is only now being revealed by modern neuroscience. These thinkers, and many more, asserted throughout their lives that a person can reach his or her full potential only through leisure. That may sound paradoxical; after all, we are taught from a very young age some variation of “the Devil finds work for idle hands.” But given the view of our brains that is emerging from modern neuroscience, it may be no accident that as our working hours increase, our mental well-being and physical health decrease.
The human brain is unique in the animal kingdom for its ability to come up with novel solutions to problems. Animals, especially non-human primates, are certainly creative, but only within the narrow limits of their own cognitive and perceptual worlds. Humans have invented technology to extend our perception to invisible parts of the electromagnetic spectrum, and soon we may even be able to extend our memory and cognition using neurotechnology. Many neuroscientists argue that humans are unique in the degree to which we are conscious. Humans are the only species to have developed a communication system that allows us to create art and acquire complex bodies of knowledge.