“I would while away the hours
Conferrin’ with the flowers
Consultin’ with the rain.
And my head I’d be scratchin’
While my thoughts were busy hatchin’
If I only had a brain.”
– E.Y. Harburg
Over the past 10 years, many books and articles have been written, and many seminars and training sessions given, with the sole purpose of getting people to “think outside the box.” Team-building programs and creativity and innovation classes all purport to break the barriers that confine intelligent people to routine approaches and mundane solutions. We try to awaken the “inner child” and connect to the “genius within.” As often as not, we find our inner child can be as cranky and obstinate as our real progeny, and our genius within is out for a long lunch. Perhaps it is time to take a closer look inside the box.
Nestled within the cranial cavity is a three-pound, jellylike conglomeration of ganglia and nerve cells that has been neatly labeled “the human brain.” Brains are not pretty — most real organs are not. But unlike the heart, the brain has no aesthetic appeal, even with artistic rendering. At best it looks like a pile of uncooked sausages. After the initial “yuck,” a child (inner or otherwise) might ask, “Why do we need to have brains?”
Philosopher C.D. Broad said, “The function of the brain and nervous system is to protect us from being overwhelmed and confused by (a) mass of largely useless and irrelevant knowledge by shutting out most of what we should otherwise perceive or remember at any moment and leaving only that very small and special selection which is likely to be practically useful.”
What possible advantage could there be to reducing the intake of stimuli and ignoring most of our environment?
“I’d unravel any riddle
For any individdle
In trouble or in pain.
With the thoughts (I’d) be thinkin’
(I) could be another Lincoln
If (I) only had a brain.”
Consider the task of problem solving. In actuality, it is a three-step filtering process whereby we gradually reduce a universe of possibilities into a world of probabilities.
The first filter is called “problem identification.” Here we narrow our focus to a specific goal. It might be a physical objective such as acquiring a new car or an abstract one such as formulating a theory. On an intellectual level, goals fall into one of two categories. We either want to discover/understand or prove/explain.
In the former we seek to establish relationships between the known and the unknown. In the latter we want to establish rules to predict future outcomes. Right from the start we have to make a choice: Do we favor inductive reasoning over deductive, or vice versa? This is not to say that we won’t use both in the process of problem solving, but to achieve our goal we will be inclined to apply one more than the other. Trying to establish rules before we understand the part/whole relationship can lead to overly simplistic solutions. Making sweeping generalizations that set ambiguous boundaries can paralyze the practical application of knowledge.
Once we’ve identified how we will approach the problem, we can move on to the next filter: problem definition. We begin by deciding if a solution to the problem is going to make a significant difference. After all, problem solving requires energy and other resources. If the payback is small or nonexistent, why bother?
Assuming the payback is adequate, we now have to sort through all the data streaming toward us. With our objective in mind, we separate what is relevant from what is irrelevant. If we’re having difficulty making the distinction, we should take a second look at how we identified the initial problem. Otherwise, we can arrange the relevant data into acceptable solutions. Before we’re done defining the problem, we should eliminate the solutions that do not meet our objective and make the desired difference.
By the time we reach the final filter, problem solution, we know what our objective is, what is relevant to that objective and why we want to achieve it. Now we must determine whether we can address the problem directly or need to do some preparatory work. If we need subgoals, we have to go back and redefine the problem, prioritizing the relevancies. Otherwise, we can begin to formulate a plan of action.
First, we verify our resources. If they are inadequate, the possible solution has become improbable. Rather than hoping for a miracle, we should go back to problem definition and see whether we can come up with something more workable. If all systems are good to go, we can solve the problem. But before we’re through, we should confirm that the problem we solved was the one we intended to solve. If the solution doesn’t match the objective, we have misidentified the problem and solved the wrong one. Somewhere along the line, we failed to apply one or more of the filters. We’ll need to start over and re-examine our objective.
If the solution meets our (previously identified) objective, we’re done — the problem has been adequately and satisfactorily solved. We can move on to new challenges.
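The three filters above behave like a loop: identify an objective, keep only what is relevant, verify resources, confirm the result, and fall back to an earlier filter when something fails. A minimal sketch of that loop follows, using a toy route-picking problem. Every name and number here (the `identify`/`define`/`solve` functions, the routes, the costs) is a hypothetical illustration invented for this sketch, not anything from the article.

```python
# A toy sketch of the three-filter process. All data and names are
# illustrative assumptions, not the authors' method.

def identify(goal):
    """Filter 1 (identification): narrow focus to one specific objective."""
    return goal

def define(options, objective):
    """Filter 2 (definition): separate relevant options from irrelevant ones."""
    return [o for o in options if o["gets_to"] == objective]

def solve(options, objective, resources):
    """Filter 3 (solution): verify resources, then pick a workable plan."""
    for plan in define(options, objective):
        if plan["cost"] <= resources:   # inadequate resources make a
            return plan                 # possible solution improbable
    return None                         # go back and redefine the problem

routes = [
    {"name": "highway",      "gets_to": "office", "cost": 5},
    {"name": "scenic drive", "gets_to": "beach",  "cost": 2},
    {"name": "side streets", "gets_to": "office", "cost": 3},
]

# The scenic drive is filtered out as irrelevant (wrong objective); the
# highway is filtered out as too costly; the side streets survive all three.
print(solve(routes, identify("office"), resources=3))
```

The point of the sketch is the order of the filters: relevance is settled before cost is ever examined, which mirrors the article’s insistence on defining the problem before formulating a plan of action.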
Most of the time, most of us neglect, forget or ignore one or more of the filters because we can get away with it. Not all problems need to be strenuously identified or defined before they can be adequately solved. These are problems that have been solved successfully before, with established routines for achieving the desired result.
These tried-and-true methods might be things we’ve discovered for ourselves such as the quickest route between home and office, or they might be systems that have been imposed on us such as arithmetic. Either way, the problem-solving process has been structured and is familiar. We know what objective we want to achieve, we know what is relevant to achieving that objective and we know what must be done.
An example of a structured problem is: 17 + 124/12 – 16 * 108 = ?. The significance of the symbols and the priorities of calculation have been laid down as the rules of mathematics. All we have to do is remember the arithmetic we were taught in grade school. If we’ve forgotten the rules — whether we should work from left to right or divide and multiply before we add and subtract — we face a semistructured problem. We know the objective is to find a number, but we need information about how to prioritize the procedures before we can come up with a satisfactory solution.
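The difference those rules make can be shown in two lines. The contrast below between standard precedence and naive left-to-right evaluation is an illustration added here, not part of the original lesson:

```python
# The article's expression, 17 + 124/12 - 16 * 108, evaluated two ways.

# Standard precedence: division and multiplication bind before
# addition and subtraction, as the rules of arithmetic dictate.
standard = 17 + 124 / 12 - 16 * 108      # 17 + 10.333... - 1728

# Naive left-to-right evaluation, ignoring precedence.
left_to_right = (((17 + 124) / 12) - 16) * 108

print(standard)       # roughly -1700.67
print(left_to_right)  # -459.0
```

The two answers differ by more than a thousand, which is exactly why a forgotten rule turns a structured problem into a semistructured one: the objective is unchanged, but the procedure must be recovered first.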
“Oh, I could tell you why
The ocean’s by the shore.
I could think of things
I never thought before.
And then I’d sit
And think some more.”
To add or subtract, multiply or divide, calculate the circumference of a circle or the trajectory of a missile, we have to learn to think differently. We have to imagine events that do not yet exist. This is unstructured problem solving — the domain of innovators, artists and Nobel Prize winners. Here, problems are discovered and brainstorms spawned. We all visit this rarefied atmosphere, if only in our dreams, but we never stay for long. Although it is exhilarating, it is also exhausting. Generating new ideas becomes increasingly difficult as we become better at what we do because our brains have evolved to emphasize efficiency over effectiveness.
Development of expertise in any subject requires time, rehearsal and opportunity. But exposure to information, no matter how intense, is not enough. For information to be of use, it has to be linked to an objective and a procedure for achieving that objective. Thoughts turn into actions when nerve impulses activate muscles in an organized manner. Without synchronization, muscle movement is reduced to tics and twitches. We would not be able to walk, talk, chew, write a check, kiss, smile, scream, juggle, cook — in short, do anything but vegetate.
With our brains continually supplied with new material and inherently able to reorganize themselves in response to environmental influences, one would expect learning to be relatively easy. Not true. Over time and with experience, our neural networks can become so well-trained that we react without any conscious awareness of what we’re doing. We can easily get locked into a habitual way of thinking if we don’t recognize how we are filtering and organizing the stimuli coming our way.
“I’d not be just a nothin’
My head all full of stuffin’
My heart all full of pain.
I would laugh and be merry.
Life would be a ding-a-derry
If I only had a brain.”
Keith Stanovich coined the term “dysrationalia” to describe “the inability to think and behave rationally despite adequate intelligence.” Barbara Tuchman labeled the self-destructive behavior of otherwise intelligent people “folly” when bad decisions were made because opportunities to make better ones had been missed. Robert Sternberg cataloged 20 stumbling blocks that prevent intelligent people from making good decisions and thereby achieving success. David Perkins declared that habituated thinking is a “default [that] happens automatically when no special action is taken.” Much has been written to lament the price we pay when our brains function efficiently but not effectively.
Efficiency usually is taught through repetition and rote learning. As has been said, our brains are well-designed to function efficiently. When we don’t have a clear understanding of what we want to accomplish and why a particular action is necessary, we get locked into our mind-numbing routines. Relevant details blur into irrelevant ones. We haven’t learned to differentiate significant matters from inconsequential ones. When something goes wrong, we’re at a loss for what to do. Either we deny a problem exists, or we act impulsively, trying to set things right again. In blind desperation we’re reduced to trial-and-error behavior. If the problem is somehow resolved, we might not remember what we did or understand why it worked — we haven’t learned anything that will improve future decisions.
Experts are more adept at decision making than experienced workers because through immersion, they’ve acquired rules, as well as routines. Rules provide logical structures for organizing information. Structures reduce the need to sort through vast quantities of information. They focus attention on relevant details, enabling the experts to combine new experiences with previous knowledge.
Expertise is more than experience — it is the internalization of the reasoning process that governs a discipline. When we describe someone as having an aptitude for mathematics or the law, we are referring to his or her preference for the “if-then” process of conditional and causal reasoning. The economist and the historian prefer to view the world in terms of cost/benefit strategies, aka part/whole relationships. Biology and much of medicine, however, require the ability to categorize and classify.
For the most part, rules for reasoning are taught implicitly and learned subconsciously. The first four years of college prime the student’s mind for a particular way of thinking about the world. The classes that we breezed through tend to be those that reflect our worldview. Those that we struggled with are modeled on other systems of thought.
Research at the University of Michigan and the University of California has demonstrated that as little as two years of study can greatly influence our perspectives, expanding them in some ways and limiting them in others. Expertise can impede our ability to communicate with others outside our profession, and it can prevent us from empathizing and understanding other equally valid points of view.
To be truly effective, we need more than specialized training and creativity exercises. We need to look at our fundamental thinking style; that is, we must examine how we organize information and set about solving problems. Only through this understanding are we able to change our perspectives on demand. Thinking about the box is the first step to freeing ourselves from its confines.
Donalee Markus holds a doctorate in administrative and management sciences, as well as a master’s degree in curriculum development. Lindsey Paige Markus, J.D., holds a master’s degree in international finance and economics. They can be reached at email@example.com.