My previous posts have been about ways to manage your thoughts. In discussing this subject with a friend last night, I realized that there is a key principle underlying the entire concept of thought management, one I’ve previously written about tangentially in “The Trap In Believing Your Thoughts”, but have never laid out in quite this manner.

The principle is that many (at times even most) of our thoughts and predictions are flat-out untrue, or even crazy! That being the case, it is perfectly legitimate to question them and strive to bring them closer to the truth (or at least to a version of the truth that serves us better than the untrue or dysfunctional way we’ve been thinking).

Dozens of striking examples of how warped our thinking can become without our being aware of the distortion are presented in “Thinking, Fast and Slow”, published last year by the Nobel Prize winner Daniel Kahneman. Kahneman writes about our two “systems” of thinking. The first is the unconscious, exceptionally rapid operation of System One, a form of mental processing that allows us to recognize immediately from someone’s expression that they are angry, sad, or happy; that alerts us to whether an object is moving towards or away from us; and that lets us instantly complete sentences like “Two plus two is _____” or “The capital of France is _______”. No conscious effort is required to elicit this knowledge.

System Two involves effortful mental activity – contemplating, figuring out, choosing. Kahneman writes: “When we think of ourselves, we identify with System Two, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do”. But he goes on to demonstrate, with hundreds of examples, how System One “sets the stage” by creating impressions and feelings that are a main source of influence on the explicit beliefs and deliberate choices of System Two.

Here are a few of them:

1) Eight judges spent many days examining requests for parole. Overall, only 1/3 of parole requests were granted. But the judges’ decisions were strongly biased by an unexpected factor: their hunger! Two-thirds of parole requests that were reviewed immediately after the judges had a meal were approved; virtually none were approved just before the next meal.

2) Descriptions of two people, Alan and Ben, were given to a large number of respondents. Alan was said to be intelligent – industrious – impulsive – critical – stubborn – envious. Ben was said to be envious – stubborn – critical – impulsive – industrious – intelligent. Alan was viewed much more favorably than Ben, even though they were characterized by exactly the same descriptors. It was the order in which the descriptors were presented that influenced respondents’ judgments.

3) Respondents were asked to spin a “wheel of fortune” that was rigged to stop only at 10 or 65. After spinning the wheel, they were asked, “What is your best guess as to the percentage of African nations in the UN?” Those who landed on 10 answered 25%, on average. Those who landed on 65? 45%, almost double!

We have a natural tendency to want Systems One and Two to agree, thereby avoiding the discomfort of holding two contrary interpretations at once. (That discomfort is referred to as cognitive dissonance, although here the dissonance being avoided is between conscious and unconscious interpretation.) But since System One “sets the stage” for our thinking without our realizing it, the carefully thought-through conclusions we reach using System Two are often dead wrong.

Kahneman then delves into more “real world” scenarios that illustrate how wrong even highly expert conclusions can be. For example, hundreds of political analysts were asked to predict the likelihood of particular outcomes (would the U.S. go to war in the Persian Gulf; would Gorbachev be ousted in a coup; which country would become the next big emerging market?). Some 80,000 predictions were tabulated, and their accuracy turned out to be below that of simply making random choices!

Experienced radiologists who evaluate chest X-rays as “normal” or “abnormal” contradict themselves 20% of the time when they see the same picture on separate occasions.

In any given year, 2/3 of mutual funds underperform the overall market, even though they are being managed by so-called “experts”.

Kahneman provides detailed explanations for these unexpected outcomes, but the point to keep in mind is how often the best thinking of even highly trained specialists turns out to be incorrect. The best thinking of the average person is unlikely to be any better.

The takeaway I recommend from all this is that because the conclusions we reach after thinking a situation through are so often wrong, it is essential to question your thinking, particularly when your thoughts lead you to a place of depression, powerlessness, and victimhood (the place from which you are least likely to be creative and least likely to develop solutions to the problems you confront). Question your what-often-feels-like-infallible judgment and make room for thinking in alternative ways!

* An example of a “crazy” thought is the anorexic who looks in the mirror and judges herself fat. But how far is that from what many people do when they look in the mirror and judge themselves (choose one or more): too old, too fat, too short, too dark, too busty, too unmuscular, etc.? For a large proportion of those people, ten years from now they’ll be looking at photos from today and wishing they could look that good.