Tuesday, November 22, 2011

Intuition and counterintuition


Intuition:  What’s the deal with intuition these days?  It seems to be on everyone’s mind, a brainworm on the loose.  People claim that tablets like the iPad “tap into intuition,” that Steve Jobs had an “intuitive designer’s sense,” that George Bush “trusted his gut feelings” about the presence of WMD in Iraq, and that Kim Kardashian decided “intuition led me to divorce” after 72 days of wedded bliss, while the Huffington Post writes “Science says to trust your gut.”  Intuition seems to be more valued than ever, although there has been no recent upgrade in our collective intuitive skills.  Is intuition really just the flowering of some inner, secret power?

By contrast, the current movie “Moneyball” is about the success of the 2002 Oakland A’s, a team put together with the guidance of some clever statistical analysis by their general manager, a baseball quant-jock if there ever was one.  Even Roger Ebert says the film is about “the war between intuition and statistics.”  Is there really a war going on?

In his book “Blink,” Malcolm Gladwell talks about intuition as “thin-slicing” experience: a rapid-assessment skill built up from a deep base of practice in a field (the famous 10,000 hours he later popularized in “Outliers”).  If you have that much time-on-task, practicing and learning how to operate efficiently in a field, then you’re capable of rapidly assessing a situation from a few quickly scanned clues and coming up with a quick recognition of what’s going on.

Here's the thing to know:  How much time have you spent doing search?  My best estimate is that I've done about 5,000 hours of search since I first started using Google in late 1998.  That is, I started practicing my search skills around 13 years (or 4748 days) ago.  

If I've done a bit over 1 hour of search / day since then (which seems reasonable), that means I've invested ~5,000 hours of practice.  While that's a lot of time, it's only half of the 10,000 hours that are usually needed for real expertise. 
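That arithmetic is easy to check.  Here’s a quick sketch in Python (the November 1998 start date and the 1 hour/day rate are just my estimates from above):

    from datetime import date

    # Rough check of the practice-hours estimate.  Assumptions: search
    # practice began in "late 1998" (I picked November 1), at roughly
    # 1 hour of search per day ever since.
    start = date(1998, 11, 1)
    today = date(2011, 11, 22)  # the date of this post

    days = (today - start).days
    hours = days * 1.0          # ~1 hour of search per day

    print(days, "days of practice")                     # 4769 days
    print(int(hours), "hours -- about half of 10,000")  # 4769 hours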

System 1, System 2: Nobel Prize winner Daniel Kahneman points out that people have two different and parallel systems of thought when they confront  problems.  The cleverly named “System 1” is a fast recognizer of situations in context—it identifies and labels objects, picks up on relationships, and does so by recognition, rapidly, rather than by deliberation.  

Then there’s “System 2,” the slower, more deliberate, symbol-pushing and rational part of our minds. 

As an example, System 1 recognizes that the number sequence 2, 4, 8, 16, 32… is just doubling from one term to the next.  It would be System 2 that lets you realize that this is also the powers of 2.  Of course, if you’re a computer scientist, the powers of 2 have become, over many repeated exposures, a System 1 effect.  For non-CS majors the problem 2 * 256 is a System 2 task.  For CS majors, it’s a System 1 task—you recognize the pattern and say “it’s 512…” without thinking much.  In this sense, intuition is what you’ve been trained to expect to perceive.  It is the power of repeated exposure and the accumulation of inarticulate recognition skills.
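You can watch the two patterns coincide in a few lines of Python (purely illustrative):

    # The doubling sequence and the powers of 2 are the same pattern.
    doubling = [2]
    while doubling[-1] < 512:
        doubling.append(doubling[-1] * 2)

    powers = [2 ** n for n in range(1, 10)]
    assert doubling == powers  # 2, 4, 8, ..., 512

    # What a CS major "just sees": doubling a power of 2 is a bit shift.
    assert 2 * 256 == 256 << 1 == 512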

About System 1, Kahneman writes in his book “Thinking, Fast and Slow”:
“We are prone to think that the world is more regular and predictable than it really is, because our memory automatically and continuously maintains a story about what is going on, and because the rules of memory tend to make that story as coherent as possible and to suppress alternatives. Fast thinking is not prone to doubt.”

Nor is System 1 particularly good at noticing contradictions.  We swim in a sea of counterintuitions--things that seem to be intuitively correct, but are not.  We have difficulty seeing the sea we swim in simply because we swim in it all the time.  

Some examples:  
  • How is it that clouds, which are made of water, can float in mid-air?  (Water is awfully heavy.)  
  • The world is visibly and obviously flat—yet we now believe that it’s intuitively obvious that the world is round.  (Trust me, historically speaking, that wasn’t obvious at all!)  
  • When we’re on a merry-go-round, our intuition tells us that the force is outward—that centrifugal force is really trying to thrust us radially away from the center, not on a tangent along the direction of travel.  
  • Dense things are typically opaque, except for glass and water, which are “intuitively obvious” exceptions to the rule.  

Counterintuitive: Except that they’re not obvious; it’s just a pattern you’ve seen so often that System 1 doesn’t even pick up on the contradiction. 

Intuitive thinking is based primarily on what you’ve experienced and the patterns you’ve picked up.  While it’s fast, it’s also errorful.  Dan Ariely has asked hundreds of Princeton undergraduates the following question:  “A bat and ball together cost $1.10.  The bat costs $1 more than the ball.  How much is the ball?”  It’s a simple question, but around 50% of Princeton seniors get it wrong and say that the ball costs $0.10.  (That can’t make sense.  If the ball is $0.10, then “$1 more than the ball” means that the bat is $1.10; adding the bat and ball prices together means the bat + ball cost $1.20.) 

So, why do they get it wrong so often?  

Because they don’t check their work.  Why?  Because it’s a bit of a hassle, and the answer is so obvious and apparent (and the stakes are so low) that it’s not worth the effort.  This is characteristic of many intuitive answers—it’s so *obvious* that it’s not worth checking. 
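But checking is cheap.  Here’s what the thirty-second System 2 check looks like in Python:

    # The intuitive (wrong) answer: the ball costs $0.10.
    ball = 0.10
    bat = ball + 1.00            # "$1 more than the ball"
    print(round(bat + ball, 2))  # 1.2 -- the check System 1 skips

    # System 2: solve ball + (ball + 1.00) = 1.10 for the ball.
    ball = (1.10 - 1.00) / 2
    bat = ball + 1.00
    print(round(ball, 2), round(bat, 2))  # 0.05 1.05 -- summing to 1.10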

Seems to me that there’s not really a contradiction between System 1 (intuition) and System 2 (rational) modes of knowing.  Instead, one’s just faster than the other.  Both are useful, both are necessary.  But the great achievement of science has been to tell us that intuition always has to be checked.  In essence, science is the overturning of intuition by slow, painful, careful System 2 reasoning.  Clouds float because cloud-borne water droplets are really, really tiny and are kept aloft by random air movements.

A note from Dan’s System 2:  Cloud droplets average between 1 and 100 microns.  A typical droplet 20 microns in diameter is 4.2 picoliters in volume with a weight of 4.2 nanograms, falling at a terminal velocity of 0.02 mph.  Thus, an updraft at just over 0.02 mph will keep the cloud in the sky.  That’s not much updraft.  Or to look at it another way, if the cloud formed at 10,000 feet, at 0.02 mph it will take nearly 4 days to fall to the ground.
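For the curious, here’s that arithmetic as a Python sketch, using Stokes’ law for the terminal velocity of a small sphere.  (The air viscosity and water density are standard textbook values; Stokes gives ~0.027 mph, which is the “just over 0.02 mph” above, and at the rounded 0.02 mph the fall stretches to the “nearly 4 days.”)

    import math

    # System 2 arithmetic for a 20-micron cloud droplet.
    r = 10e-6     # radius in meters (20-micron diameter)
    rho = 1000.0  # density of water, kg/m^3
    mu = 1.8e-5   # dynamic viscosity of air, Pa*s
    g = 9.81      # m/s^2

    volume = (4.0 / 3.0) * math.pi * r ** 3
    print(volume * 1e15, "picoliters")         # ~4.2 pL
    print(volume * rho * 1e12, "nanograms")    # ~4.2 ng

    # Stokes' law terminal velocity for a small sphere.
    v = 2 * rho * g * r ** 2 / (9 * mu)        # ~0.012 m/s
    print(v * 2.23694, "mph")                  # ~0.027 mph

    # Time to fall from 10,000 feet at that speed.
    print(10000 * 0.3048 / v / 86400, "days")  # ~3 days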

The trick is to know when to use one mode versus the other.  “Trust the Force, Luke,” Obi-Wan whispers into our ears.  But at the same time, you can’t build a pan-galactic empire with massive infrastructures based on your intuitions about mechanical engineering, power grid design, or waste management systems. 

We know there are a number of rather rapid inferences that are System 1 tasks.  A while ago Bob Zajonc showed that determining preference is much faster than cognition… that is, it anticipates cognition.  You see this when making choices about which kind of thing you prefer over another—be it mates, brands of peanut butter, or random Chinese characters.  As he said, “Preferences need no inferences.”  Preference is fast, and gets there before you’ve figured out what you really should want.  That’s what System 2 reasoning is for. 

It strikes me that we have a lot of brainware dedicated to specific fast System 1 type tasks, and even those mechanisms get it wrong much of the time.  We’ve all seen various visual illusions that we KNOW can’t be true—a vertical table that looks much longer than a horizontal table; two patches of color that look very different in a scene, but can be easily shown to be the same color when placed side-by-side. 

The strange thing is that visual illusions persist, even when you know they’re wrong; even when you’ve measured the differences; even when you put the color patches side by side.  Even though you do the measurements yourself, you can’t convince your vision system to accept your measurement as truth.  It’s as though you haven’t learned anything in the last minute of experience.  

(In the example below, squares A and B are the same color.  Use an eyedropper tool to measure the colors and you’ll find they’re identical.  This illusion image is from Wikipedia; it was created in 1995 by Edward H. Adelson, Professor of Vision Science at MIT.)  
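If you don’t have an eyedropper tool handy, a couple of lines of Python with the Pillow imaging library will do the same job.  (The filename and the pixel coordinates below are placeholders; read points inside A and B off your own copy of the image.)

    from PIL import Image  # the Pillow imaging library

    # Sample one pixel inside square A and one inside square B.
    img = Image.open("checkershadow.png").convert("RGB")
    print("A:", img.getpixel((115, 200)))  # e.g. (120, 120, 120)
    print("B:", img.getpixel((160, 130)))  # the same RGB triple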



Even though we DO more vision processing than anything else with our brains, we still make predictable, repeatable mistakes in vision.  How can we be better in domains where we don’t have so much hardwired brainware? 

This is the realm of the cognitive illusion, and it is even more problematic.  We know that if you set up a program with an opt-out versus an opt-in choice, the differences in selection rate are huge.  In one European study, the rate of people choosing to be in an organ donation program was massively different: in Sweden the participation rate is 86%, while in Denmark it is only 4%.  The difference is that Sweden’s program is opt-out, while Denmark’s is opt-in. 


Dan Ariely points out that when we’re filling out the form we 

“feel as though we’re making decisions… but the person who designs the form is actually making the decision for you.  You like to think that the options don’t influence us, but that’s not true.”  

The illusion is that you have control over your destiny.  That is, you think your System 2 rational decision-making system will actually express your deeply-held, fundamental beliefs.  “I believe that organ donation is a social good, so I’ll participate.”  That’s the story you might tell yourself.  But statistically speaking, the choice is made for you when the form is designed as opt-out. 
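The mechanics are almost embarrassingly simple.  Here’s a toy model of the default effect (the 10% figure is purely illustrative, not from the study): if only a small fraction of people ever touch the checkbox, the default determines nearly everything.

    # Toy model of default effects.  Assume only a small fraction of
    # people ever flip the checkbox away from its default setting.
    def participation(default_enrolled, fraction_who_change):
        if default_enrolled:
            return 1.0 - fraction_who_change  # opt-out: changers leave
        return fraction_who_change            # opt-in: changers join

    print(participation(False, 0.10))  # opt-in form:  0.1 enrolled
    print(participation(True, 0.10))   # opt-out form: 0.9 enrolled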

We don’t recognize the illusions we all live with day-to-day; we’ve come to accept them as intuitive and obvious.  Normal is what you’ve grown up with; the non-intuitive, confusing, difficult high-technology world is whatever was invented after you turned 8 and stopped seeing everything as pre-destined.  

Isn't that intuitive?  


1 comment:

  1. What's the deal with Google Reader mangling your posts—making your line feeds disappear instead of turning them into spaces? And is it Reader doing the mangling or is your RSS feed mangled by blogspot? I suspect it is blogspot, as I've not seen the same problem with other blogging platforms.

    Incidentally, your System 2 failed on "When we’re on a merry-go-round, our intuition tells us that the force is outward—that centrifugal force is really trying to thrust us radially away from the center, not on a tangent along the direction of travel." There are no tangential forces (other than the friction that is slowing down the merry-go-round and whatever motor or push from a runner that is speeding it up).

    In the frame of a stationary observer the centripetal force keeping the person moving in a circle is directed towards the center pivot. No force at all is involved in moving in a straight line at constant velocity. It is the removal of the centripetal force that causes the person to fly out at a tangent, not the addition of a mysterious tangential force.
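    A quick numerical version of the commenter's point, with illustrative numbers (the 2 m radius, 4-second spin, and 50 kg rider are made up for the example):

        import math

        # On a merry-go-round, the only horizontal force is centripetal,
        # directed inward toward the pivot.
        r = 2.0        # radius, m (illustrative)
        period = 4.0   # seconds per revolution (illustrative)
        mass = 50.0    # rider, kg (illustrative)

        v = 2 * math.pi * r / period  # tangential speed, ~3.1 m/s
        force = mass * v ** 2 / r     # inward force, ~247 N
        print(v, force)

        # Release the rider at position (r, 0) moving in the +y direction:
        # with no force at all the path is just x = r, y = v * t --
        # a straight line along the tangent.  No outward force required.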
