
Bias Basics--cognitive bias and the investor

Posted by Tom on Oct 01, 2014

Investment policy and financial planning are processes that go beyond the math, into issues of perception.

When going through the investment policy and financial planning process, we occasionally get the question, “Haven’t we answered that already?” The answer is invariably “Yes, sort of.” We often ask the same question in different ways because we are looking at the differences in the answers: the way a question is presented can change the answer, and understanding the implications of those differences is part of the challenge for practitioners. People in general have a surprisingly hard time understanding both what we do and why we do it. This is nothing to be ashamed of; it is part of our evolutionary wiring. We are wired to remember certain kinds of information well, while other kinds of information are simply not on our evolutionary radar.

Knowing where to find reliable water supplies and food sources, recognizing threats or edible plants, finding your way home after hunting or gathering: these are the kinds of things we are wired for. Other kinds of information, like investment and planning decisions in the face of an unknown, distant future, have not had adequate time (from an evolutionary perspective) to get wired into our collective brains. Because of our difficulty with particular kinds of information processing, and the ever-present need to process information quickly, our brains developed very particular information-processing skills. It would not have been helpful to be making long-term food storage plans on the prehistoric savannah while a pride of hunting lions was deciding that you were dinner. Our brains are miraculous in the speed with which we can make decisions based upon very sketchy input. The kinds of connections our brains make, and the mental maps we can apply virtually instantaneously to a situation, make our decision process very quick. We usually call these instances “intuition” or “gut feelings,” or we say that some fringe factor in the decision was a clue to what we should do. Psychologists call these shortcuts “heuristics.” They are clearly useful in the vast majority of situations we encounter on a daily basis, and without them we would probably be completely paralyzed in life. Imagine having to rigorously consider your choice of coffee in the morning, or even whether to have coffee at all, or the implications of taking meeting notes in pencil instead of pen. Imagine the thousands of decisions you make every day. Heuristics help you make them. Unfortunately, in some cases they can lead to some curious results.


Do you know Linda?

Let’s use one example to show how our intuitions sometimes run “counter to the math.” A famous example of a heuristic that can go wrong is popularly known as the “Linda Problem.” It comes from the research of Daniel Kahneman and Amos Tversky.¹ The situation given is this:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

The Linda Problem is a well-known cognitive bias experiment.

So based upon what we know about Linda, which is more likely?

a) Linda is a bank teller.

b) Linda is a bank teller and is active in the feminist movement.

OK… did you choose “B”? If so, you are not alone: 89% of the study participants agreed with you. Really, with all that passion for social justice, what else could the answer be? But wait: if she is a bank teller and active in the feminist movement, she is also a bank teller. For “B” to be true, “A” also has to be true, but “A” can be true without “B.” The answer has to be “A.” This has been attributed to many factors, but the common view is that it is an example of the “representativeness bias,” also known as the “conjunction fallacy.” It is thought to be a byproduct of the remarkable pattern-matching process our brains use to size up situations. It is only one of a long and steadily growing list of heuristics (also called cognitive, behavioral, or emotional biases, although technically each of those is a distinct, if similar, thing) that affect our day-to-day decisions and get called on to solve even larger and more complex problems, with mixed results. We will look at one more, just for fun.
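First, though, for the quantitatively inclined, here is a minimal sketch of why the math forces “A.” The specific probabilities are invented purely for illustration; the point is that the joint probability is the bank-teller probability scaled by a factor that can never exceed one.

```python
# Conjunction rule demo: P(A and B) can never exceed P(A).
# The probabilities below are invented purely for illustration.

p_teller = 0.05                  # P(A): Linda is a bank teller
p_feminist_given_teller = 0.95   # P(B|A): feminist, given she is a teller

# P(A and B) = P(A) * P(B|A), and P(B|A) <= 1, so P(A and B) <= P(A).
p_both = p_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_teller:.4f}")
print(f"P(bank teller AND feminist) = {p_both:.4f}")
assert p_both <= p_teller  # holds no matter what numbers you plug in
```

Plug in any numbers you like for the two inputs; the assertion at the end never fails.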


Pass the cranberries

For the sake of brevity, and because we will be talking about this one in other blogs, I will just give a nod to this problem. Bertrand Russell laid out the framework in “The Problems of Philosophy” back in 1912; we will use the updated version commonly told today. Thinking about the brilliant pattern-matching machinery in our heads brings us to the slippery subject of inference. We infer lots of information that we do not actually possess. This is part of the aforementioned wiring, and plenty of inferences and deductions are entirely plausible. If my dress shirt has a big brown stain on the front, a similarly stained coffee cup is in my office wastebasket, and I seem grumpy and keep glancing at my shirt, you might guess that I spilled coffee all over myself, with or without help. You probably would not jump to the conclusion that I had colored my shirt with watercolors. That is an entirely possible thing to have happened; it is just not likely. It is the old Occam’s Razor. Russell starts with the question of whether the sun will come up tomorrow. Based upon past history, the answer is ostensibly yes, but that is not at all guaranteed. Remember the opacity of information about the future? We assume the sun will rise because it always has, but since the future is entirely unknown, we cannot say so with certainty. That does not mean I will skip setting the alarm. Where this gets stickier in practice is where else we use this heuristic, which is commonly referred to today as the turkey illusion (although Russell used a chicken in his example).

The Thanksgiving turkey didn't expect his life to change, either.

Imagine life as a turkey. You are born, a farmer brings you food, you peck around with your turkey pals eating worms and turkey pellets, and you lead a leisurely turkey life. You are clearly a domestic turkey, perhaps a pet; as a turkey you have no concept of such things, but you never have any cause to worry. No predators stalk you, although the farm cats are a nuisance. You are protected by a high wire fence and kept safe and warm at night in a room with all of your pals. If a fox comes around to threaten you, the farmer kills it to protect you. Every day is blissful eating and roaming and turkey fellowship. If a turkey has a notion of love or family or anything like it, perhaps you think of the farmer in such a fond light. Until just about November 23rd, which is a bad day for you as a turkey: you are in the United States, and now you are headed for the Thanksgiving table. The first stirrings of panic probably come when you see the axe. In how many ways do we infer things about a completely unknown future without pausing to consider how much confidence our assumptions deserve? How sure are we about this, really? We often blithely accept expectations like historic market returns and project them forward with complete confidence. That seems like a pretty slippery slope to me. I won’t say ignore them; I set my alarm with the idea that the sun will come up. But when it comes to market index return expectations, I think you want to be more circumspect. Don’t even get me started on the returns of any particular “guru.”
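If you want to see the turkey’s predicament in miniature, here is a hedged sketch; the rule of succession and the day counts are my own illustrative choices, not part of the original story. A naive inductive forecaster grows more confident with every uneventful day, so its confidence peaks on the very morning the pattern breaks.

```python
# A sketch of the turkey's inductive forecast, using Laplace's rule of
# succession: after n safe days, P(safe tomorrow) = (n + 1) / (n + 2).
# The day counts below are invented for illustration.

def confidence_after(safe_days: int) -> float:
    """Naive inductive confidence that tomorrow will be safe too."""
    return (safe_days + 1) / (safe_days + 2)

for day in (1, 10, 100, 330):
    print(f"after {day:>3} safe days: P(safe tomorrow) = {confidence_after(day):.3f}")

# The turkey's confidence is at its lifetime maximum on the very
# morning the pattern breaks.
```

That is exactly the shape of the problem with projecting historic market returns forward: the longer the calm stretch, the more certain the naive forecast, and the worse the surprise.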


Why are we interested in this?


As investors we learn that we see the world “through a glass darkly,” but we must recognize that we are not dealing with simple opacity. The future is unknown in every sense meaningful from an investment and planning standpoint; information about the future is fundamentally inaccessible. Unfortunately, that is just one aspect of the problem. Knowledge about the past and the present is also a bit murky, because data is not the same as knowledge or understanding. If we want to be better at investing and planning, our collective task is to work with our data about the past and present in a way that helps us understand the nature of our present decisions despite the opacity of the future. How we process information matters: we can make fundamental miscalculations that become the mental framework for future decisions, and that progression can head off down some very strange roads. If it does, we will have to either retrace our steps (if we are lucky) or suffer a serious setback (if we are unlucky).

In the twenty years I have been in the financial services industry, I have developed a simple mindset. Our first role is to do everything possible to protect our clients from themselves. I do not mean this to be insulting to anybody; most clients only need a nudge now and again, while a few need fairly assertive intervention in the decision-making process. We are obviously constrained by the role we occupy: we have no power to compel. Understanding the way people perceive the choices before them, along with the possible outcomes, is fundamental to helping with tough decisions. Being able to talk to people and glimpse the biases they are particularly prone to helps us frame questions in a way that minimizes any harmful effects, to the best of our ability. Our second role is to try to minimize any damage that we could inadvertently cause. There were certainly advisors in the late 1990s who jumped onto the technology stock bandwagon at any cost and promoted that exciting sector to their clients (overconfidence bias, anybody?). In keeping with that, we try to look at all of our advice through a number of filters to reduce the impact of any biases we have. Third, we try to help our clients use financial tools for what they can do, keeping in mind the implications of any of them backfiring. We do not think the tools of behavioral economics eliminate uncertainty or minimize market risk. We do think they help us see problems in new ways, which might help us collectively make better decisions. That is why we will be discussing them here regularly: each week or so we will examine a different bias, along with some really interesting projects we have under way to make this understanding more applicable.


1. Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124–1131.

Topics: Investment Policy, Cognitive Science, Unpredictability and Randomness, Tom Posts