  We are now using our brains to try to understand our brains. Another unique thing about humans is that we can afford to be lazy because of our technology and culture. We might think that an elephant seal lounging around on a California beach is being lazy. However, nothing could be further from the truth. The seal is preserving precious body fat and energy for when he has to hunt in frigid water or avoid sharks.

  * * *

  How did we become convinced that idleness is evil? Idleness has always been feared in the United States. The Puritans believed that hard work was the only way to serve God. Going back to 16th-century Europe, where Puritanism has its roots, Luther and Calvin both believed that constant work was ordained by God, and they commanded each person to choose a job and work at it “as a sentry post so that he may not heedlessly wander about.” Forced labor was even encouraged for the poor and unemployed as a way to keep them on “the path of righteous living.” During Luther’s lifetime, Europe was urbanizing and its population was expanding rapidly. This led to overcrowded cities, high unemployment, and inflation. There was an explosion in the number of urban poor in places like London, Venice, and Amsterdam. Unable to grasp macroeconomics, zealots like Luther saw the new urban poor as “indifferent idlers” who should be punished with toil for their original sin of laziness.

  We can trace the roots of our current obsession with work and effectiveness to Luther’s misperception that poverty is caused by laziness rather than by complex socio-economic circumstances. Idleness came to be seen as an evil. If only Luther had been trained as a sociologist, we might have more than two weeks of vacation every year.

  The consequences of Luther’s rabid anti-idleness philosophy, especially in the United States, are seen in our absurdly short vacations and our compulsive work ethic. (Not that the United States is alone in this obsession; the Japanese have even coined the term “karoshi,” which means “death from overwork.”)

  The increase in working hours is also striking given the recent explosion of time management, “get-everything-done-right-now” books and seminars on the market. On Amazon, I counted over ninety-five thousand books on time management. You would need to be very skilled at time management to read all of the time management books on Amazon. Assuming the average length of a book is two hundred pages, that’s nineteen million pages of time management material to read. You would have to read about three and a half time management books a day for seventy-two years to get through them all.
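  For readers who want to check the arithmetic, here is a minimal sketch in Python. The round figures are the assumptions from the text above, not exact counts:

```python
# Back-of-the-envelope check of the time management library,
# using the round figures from the text (all assumptions, not exact counts).
books = 95_000          # time management titles counted on Amazon
pages_per_book = 200    # assumed average book length
years = 72              # assumed reading career

total_pages = books * pages_per_book      # 19,000,000 pages
books_per_day = books / (years * 365)     # ~3.6 books every single day

print(f"{total_pages:,} pages; {books_per_day:.1f} books per day for {years} years")
```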

  If these books are really effective at making us more effective, then why are we working more hours? Why does study after study show that we are more stressed, have worse family relationships, weigh more, and are less happy because we are working too much? Does it seem odd that as the time management industry sells more books, the number of hours we work increases? To quote Bertrand Russell, “can anything more insane be imagined?”

  Could it be that we just aren’t getting the message? Do we need even more time management books and Six Sigma courses? That is certainly what the evangelical time management industry wants us to believe. Is it the case that if we could just get more done, we could have more time off?

  On the contrary, I believe there is a fundamental contradiction underlying the relationship between our culture of time management and the number of hours that professionals work. The more effective we become, the more we are pressured to do. It is an endless cycle. It stems from our belief that time cannot be wasted under any circumstances. However, wasted time is not an absolute value like mass. You can only waste time relative to some context or goal. While you are reading this book, you are wasting time relative to your goal of getting to the store before you have to pick up your kids. In fact, from some perspective, you are always wasting time.

  A scientific view of the brain is incompatible with the Lutheran, or Christian, view of man, and therefore with the work ethic that grew out of it. The much-vaunted work ethic is, like slavery, a systematic cultural invention that resulted from a commonly held but mistaken idea about human beings. We look back at the slavery system now and think it ridiculous and appalling. It is clear to us now how wrongheaded the very idea of slavery was. One day, we may look back at our work ethic in much the same way. Once we correct certain errors in our beliefs about our brains, our overworked society will appear to future generations as ridiculous and appalling.

  In the early 1990s, Steve Sampson, an anthropology professor of mine, was recruited as a consultant for a Danish computer company. The Danish company was hired by a company in Romania to modernize its operations. The Danes installed computers and an IT department. Everything seemed to function as planned, but a problem arose. After the computer system was activated and the employees were trained, people started leaving work at lunchtime. Puzzled, the Danish managers asked why the Romanians were leaving halfway through the workday. The Romanians explained that the computers enabled them to do a whole day’s work in half a day, so when they were finished with their work they went home. My professor, an anthropologist, was brought in to help solve the minor crisis that ensued. The Danes were baffled that the Romanians did not want to do twice as much work now that they had computers, and the Romanians thought the Danes were crazy for expecting them to do twice as much work just because they could do it faster. This example illustrates a cultural gap, but it also shows that technologies like the PC, which are ostensibly supposed to give us more free time, tend instead to reduce or eliminate our leisure time.

  Many of us read the summaries of scientific health studies that appear in popular magazines or the New York Times. Some of us try to implement the suggestions that researchers make about how to eat healthier, how to exercise, how to avoid cognitive decline as we age, how to educate our children, how to sleep better, how to avoid getting diabetes, how to avoid knee problems from running, etc. This book should be read similarly, as a how-to book about how to do nothing. Obviously, the “how-to” part is easy. The “why” part will take some explanation. Idleness may be a loathsome monster, but it’s a monster you should get to know.

  * * *

  From an evolutionary perspective, going back a couple of million years, when Homo sapiens-like species were beginning to evolve more advanced cultures, one thing that distinguished us from the apes was the ability to plan for the future.

  For example, apes are known to be proficient tool users, but they only seem to use the tools in their immediate vicinity. Chimpanzees often use nearby twigs to lure ants out of a colony. But no chimpanzees have been seen to carry a twig for miles, knowing that they might get hungry later and there might be an ant colony along the way.

  The first hominid species actually started carrying tools to places where they knew the objects would be useful (as opposed to just using tools in the immediate area). This indicates that their brains had developed the capacity to represent the fact that at some point in the future they might want to eat, even though right at this moment they might not be hungry. So rather than being driven by their current state, i.e., hunger, early humans began to prepare for future states.

  This necessarily requires more memory to represent the past and the future. The ability to plan for future states of hunger, cold, or thirst as opposed to just reacting to immediate desires, is perhaps what began the rapid cultural advance of human beings.

  It is interesting to muse about when the concept of work coalesced in human culture. Presumably it would have been after the evolution of language. It is doubtful chimpanzees have any concept of work, but they are very social and there is some evidence that they can plan for the future to a very limited degree.

  Our hominid line broke with chimps about five to seven million years ago, and something beginning to resemble human culture appeared about 1.8 million years ago. Language is more recent. So when did “work,” as something onerous and obligatory, replace simply being active in response to external or internal stimuli? Some higher-order conscious reflection must be necessary before you can say that you are working, as opposed to doing nothing or just trying to satisfy your hunger.

  The other side of the idleness-is-good-for-the-brain coin is that our brains come with design limitations. In much the same way that James Cameron could not have made Avatar on one normal computer, an individual human brain can handle only so much information.

  Our brains took millions of years to evolve in very different types of environments than, for example, a modern office. Humans only began reading and writing about five thousand years ago. This is why it is still such a struggle for us to learn how to read. We lack genetically specified neuronal structures for reading, and our brains have to recycle other brain structures when we learn to read. Speaking, on the other hand, evolved much earlier and we normally do not have to struggle to learn how to speak. There are stages to language acquisition that happen whenever a healthy brain develops in a language community, e.g., English, Spanish, or Chinese.

  We have specialized brain structures that are attuned to speech perception and speech production. By the time we reach adolescence, we have mastered our native language without any special instruction. In contrast, many otherwise healthy people with normally functioning brains reach adulthood unable to read.

  I point this out because our modern way of life and our work ethic are much more recent cultural inventions than reading. Swedish neuroscientist Torkel Klingberg calls this “The Stone Age brain meeting the Information Age.” For example, we do not have genetically specified brain structures for multitasking, and studies now show that multitasking makes you worse at each thing you are simultaneously attempting to do.

  In a famous series of studies, Stanford professor of communication Clifford Nass wanted to find out what gives multitaskers their proclaimed abilities. Professor Nass marveled at his colleagues and friends who claimed to be expert multitaskers: people who chat with three people at a time while answering emails and surfing the web.

  In one experiment, Professor Nass briefly showed a pair of red rectangles surrounded by two, four, or six blue rectangles to both high multitaskers and low multitaskers (people who don’t normally try to do more than one thing at a time). Then he showed the same picture again, sometimes altering the orientation of the red rectangles.

  The subjects were told to ignore the blue rectangles and to judge whether the red rectangles had changed. The low multitaskers had no problem with this task. The high multitaskers, however, performed horribly: they could not ignore the blue rectangles, and they could not tell whether the red rectangles had changed. This means that multitaskers cannot filter out irrelevant information; their attention is overloaded by whatever tasks they are not doing. In other words, multitaskers cannot actually distinguish between relevant and irrelevant information because they do not really know what they are doing at any given moment.

  The clearest evidence of this is that an estimated two thousand six hundred deaths and three hundred thirty thousand injuries are caused each year by drivers talking on their cell phones. Multitasking is compulsive behavior that actually leads to a condition very similar to adult ADHD.

  Psychiatrist Edward Hallowell dubbed this condition “attention deficit trait” to describe what happens to chronic multitaskers. He also argues that the way we run our modern work environments contributes to this problem, in which normally high-functioning people have difficulty organizing tasks, get easily distracted, and become absentminded. Modern information workers are interrupted on average every three minutes by instant messages, email alerts, or phone calls. It has been estimated that you spend anywhere from twenty-five to fifty percent of your workday just recovering from interruptions, asking yourself “where was I?” A study by Intel found that the effects of interruptions cost the company a billion dollars per year in lost productivity. Modern technology can literally make us dumber.

  We can decide to become aware of our limitations and live within them. Removing such stressors makes life enjoyable, leading to a further reduction in stress. As Klingberg points out, “when we determine our limits and find an optimal balance between cognitive demand and ability … we not only achieve deep satisfaction but also develop our brain’s capacity the most.” This process is a positive feedback loop, a hallmark of nonlinear systems. A big part of this process is being idle.

  Our bodies were designed for protein-rich diets and long periods of low-intensity physical activity, like walking or jogging, interspersed with idleness. Continually stretching our mental capacity beyond its limits leads to worse job performance, fatigue, and eventually chronic psychological and physical disease.

  The life of a Cro-Magnon actually involved more leisure than work. Back then, work meant hunting and gathering food. It is generally accepted that the Cro-Magnons’ ability to be idle contributed to the “creative explosion” in human evolution. In biological terms, our brains are almost identical to Cro-Magnon brains. Once basic needs are met—food, shelter, protection from the elements and adversity—it is no longer necessary to work.

  What follows is an exploration of what our amazing brains are doing when we are doing nothing. My goal is to offer bullet-proof scientific excuses for laziness. But I also present possible neuroscientific insights into the relationship between idleness and creativity. Finally, I hope to hammer the first nails into a coffin for the insufferable time management industry.

  2

  SOMEONE ELSE’S NOISE

  “Uncovering the mysteries of natural phenomena that were formerly someone else’s ‘noise’ is a recurring theme in science.”

  —Alfred Bedard Jr. and Thomas Georges

  Let us return to our resting brain. The discovery of a resting state network in the brain is very recent. It has been likened to the discovery of the pervasive “dark energy” in the universe.

  Just as it is unsettling to imagine there may actually be a “dark side of the force” that we know almost nothing about, it is spooky to think our brains are doing all this stuff while we sit and stare into space. Throughout the history of modern science, what appeared to be noise has often turned out to represent a deeper truth we did not yet comprehend. In neuroscience and psychology, the brain’s spontaneous activity was considered noise until very recently. But it could turn out that this noise holds the key to truly understanding our minds.

  Scientists like Buzsáki and Raichle estimate that as much as ninety percent of the brain’s energy is used to support ongoing activity, also known as the brain’s intrinsic activity. This means that, regardless of what you are doing, your resting brain accounts for the vast majority of your brain’s total energy consumption. When you activate your default mode network by doing nothing, it becomes robust and coherent. So our brains somehow seem to violate the second law of thermodynamics, which states that, left unattended, things in general get messy and lose heat. This is called entropy. It’s why your kitchen just gets messier and messier the longer you don’t clean it. However, the old adage that “the dishes don’t do themselves” does not apply to the brain.

  On the contrary, when you leave important parts of your brain unattended by relaxing in the grass on a sunny afternoon, the parts of your brain in the default mode network become more organized and engaged. In your brain, the dishes do wash themselves if you just leave them alone. It turns out your brain is never idle. In fact, it may work harder when you’re not working at all.

  Eventually, physicists had to accept that if our knowledge of the universe is not completely wrong, then the universe is mostly made from dark energy. Similarly, it is possible that much of the brain is being ignored by cognitive neuroscience and psychology.

  Psychological brain imaging experiments are designed to test brain activation levels during specific tasks in order to find out what certain brain structures are doing during those tasks. I previously pointed out that an assumption in brain science is that any detected activity that is not affected by experimental manipulations is just noise. Until its existence was verified, the brain’s resting state network was usually considered someone else’s noise. Do not confuse this with the myth that we only use ten percent of our brains. What science has revealed is that we use all of our brains, just not in the ways many people assume.

  Only minor perturbations occur in the brain’s ongoing activity during a mental task like adding something to your to-do list. For example, the neural energy required to press a button whenever a red light appears in a laboratory experiment is only a small fraction (as little as 0.5 percent) of the total energy that the brain expends at any moment.

  In contrast, the default mode of your brain uses a far higher percentage of your brain’s total energy. Figuring out just what the brain is doing while consuming all that energy when you are spacing out is precisely what Marcus Raichle and other neuroscientists are beginning to do.

  One of the striking things about our brains is that, in terms of energy consumption, they are as greedy as Goldman Sachs. The brain represents about two percent of your total body weight, yet it consumes twenty percent of your body’s energy. It is the biological equivalent of the one percent. In other words, your brain is a pig and it is selfish. This may be why ultra-endurance athletes can start to hallucinate after running fifty miles, or while competing in grueling bicycle contests such as the Race Across America, during which cyclists ride almost nonstop from California to Maryland.

  When blood sugar gets low during some insane endurance challenge, for example, and you are sleep deprived, your conscious awareness is the first thing in your body to start experiencing problems. This is true in general, and especially so during exercise.