EconTalk — Gerd Gigerenzer

On: Gut Feelings.

Episode: N/A

Date: December 2019

Background: Psychologist and author of “Gut Feelings: The Intelligence of the Unconscious”.

Key Subjects:

  • Gut feelings or intuitions guide many decisions.
    • Suggestions on how to act that arise rapidly from your unconscious mind.
    • Based on information / experience you don’t necessarily know you have.
  • Risk versus uncertainty.
    • Risk: future events occur with measurable probability.
      • Can be quantified based on a priori grounds (we know) or on the basis of empirical observation (experience).
      • Can be eliminated (pooling, insurance).
    • Uncertainty: the likelihood of future events is indefinite or incalculable.
      • Cannot be analyzed on a priori grounds (too irregular) or through empirical observation (too unique); probability calculations are impossible or meaningless.
      • Cannot be eliminated or insured against.
  • Faced with uncertainty.
    • Rely on gut, judgment, as there is no outside information on which to rely.
      • Yes: simple, robust rules of thumb (heuristics) that are helpful for prediction.
      • No: complicated calculations that over-fit past experience.
    • [This doesn’t necessarily follow. Relying on gut feelings in a situation of uncertainty is no better than rolling the dice. The need for heuristics has more to do with making the decision making process easier for humans in daily life, after you have figured out whether any ground rules govern a particular complex system. So the first step is figuring out, through quantitative analysis, whether uncertainty can be reduced by uncovering a potential pattern or ground rule. There may be examples out there (VAR, full body scans) that do a poor job of reducing uncertainty and yield costly mistakes, but that doesn’t mean all attempts at better understanding complex systems are doomed.]
  • Fast and frugal trees.
    • A set of hierarchical (sequential) rules for making decisions based on very little information.
      • Usually 4 or fewer rules.
      • No trade-offs: rules don’t cancel each other out.
    • Easy to understand, to change and provide basis for action.
    • Useful when decision making is time constrained and needs to be immediate.
    • Optimizing decisions for four possible outcomes:
      • Hit: detect a signal when there is a signal.
      • False negative: detect noise when there is a signal.
      • False positive: detect a signal when there is noise.
      • Correct rejection: detect noise when there is noise.
    • Order and select the rules to tailor for:
      • Liberal bias: signal is easily triggered – when the cost of a false negative is high.
      • Conservative bias: signal is not easily triggered – when the cost of a false positive is high.
    • [Fast-and-frugal trees demonstrate that in uncertain situations, to develop a simple decision making heuristic, you first need to perform a fair amount of quantitative analysis (i.e., the opposite of going with your gut reaction).]
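The structure of a fast-and-frugal tree can be sketched in a few lines of Python. This is a toy illustration with hypothetical cues and thresholds (not a validated tree): each rule is checked in sequence, the first rule that fires gives an exit, and there are no trade-offs because later cues never revisit an earlier decision. Ordering the "signal" exits first builds in a liberal bias (fewer false negatives).

```python
# Illustrative fast-and-frugal tree with hypothetical cues and thresholds.
# Rules are hierarchical: the first one that fires produces an exit.
# Placing the "urgent" exit first makes the tree liberally biased
# (the signal is easily triggered, so false negatives are rare);
# ordering the "routine" exits first would give a conservative bias.

def fft_triage(chest_pain: bool, age: int, risk_factors: int) -> str:
    """Classify a patient as 'urgent' or 'routine' using at most 3 cues."""
    if chest_pain:          # cue 1: exit to signal (liberal bias)
        return "urgent"
    if age < 40:            # cue 2: exit to noise
        return "routine"
    if risk_factors >= 2:   # cue 3: final split
        return "urgent"
    return "routine"

print(fft_triage(chest_pain=True, age=30, risk_factors=0))   # urgent
print(fft_triage(chest_pain=False, age=35, risk_factors=3))  # routine
print(fft_triage(chest_pain=False, age=60, risk_factors=2))  # urgent
```

Note how the second call returns "routine" despite three risk factors: once the age rule fires, later cues are never consulted. That is the "no trade-offs" property in action.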
  • Prediction models and the trade-off between bias and variance:
    • Bias: the difference between the average prediction and the correct value.
    • Variance: the variability of model predictions for a given value.
    • Under-fitting: the model is too simple:
      • Low variance (good), but high bias (bad).
      • You don’t find the pattern in the data.
    • Over-fitting: the model is too complicated:
      • Low bias (good), but high variance (bad).
      • Your pattern fits too snugly, too many false positives.
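The bias–variance trade-off can be made concrete with a small Python sketch on made-up data (illustrative models only, not the method discussed in the episode): a high-bias model that predicts the training mean everywhere, versus a high-variance 1-nearest-neighbour model that memorizes every training point, noise included.

```python
import random

random.seed(0)

# True signal: y = 2x; observations carry Gaussian noise.
# "mean" under-fits (ignores x entirely: high bias, low variance).
# "1-NN" over-fits (memorizes each noisy point: low bias, high variance).

def make_data(n=50):
    xs = [random.uniform(0, 10) for _ in range(n)]
    return [(x, 2 * x + random.gauss(0, 3)) for x in xs]

train, test = make_data(), make_data()

mean_y = sum(y for _, y in train) / len(train)

def predict_mean(_x):   # under-fitting: same answer regardless of x
    return mean_y

def predict_1nn(x):     # over-fitting: copies the nearest (noisy) neighbour
    return min(train, key=lambda p: abs(p[0] - x))[1]

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

for name, model in [("mean (high bias)", predict_mean),
                    ("1-NN (high variance)", predict_1nn)]:
    print(f"{name}: train MSE {mse(model, train):.1f}, "
          f"test MSE {mse(model, test):.1f}")
```

The 1-NN model scores a perfect 0 on its training data but degrades on fresh data, because it has fitted the noise; the mean model is equally (in)accurate on both, because its error comes from bias, not variance.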
  • Over-fitting can happen when you model complex, uncertain decision environments.
    • Hard to separate signal and noise.
    • The model captures all the noise along with the underlying pattern in the data.
  • Less is more: make the model simpler (lower the variance, increase the bias).
    • Avoid high level of false positives, which can be costly.
  • In prediction, you are better off making the model “simpler”.
    • Heuristics: example of high bias prediction model.
    • [This is an oversimplification and too general. Less is not always more. You need to find the prediction model that best helps you to reduce the decision environment complexity into actionable rules.]
  • Example: using recency.
    • Trust the recent past, not the distant past.
    • [I think this demonstrates that reducing uncertainty and uncovering ground rules often involves analyzing and understanding the process of change in non-linear systems; understanding its dynamics and based on that, try to derive simple decision rules.]
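A minimal Python sketch of the recency idea, on made-up data: in a series whose level shifts, forecasting with the last observation beats averaging the entire history, because the distant past is no longer informative about the current regime.

```python
import random

random.seed(1)

# A non-stationary series: the level jumps from ~10 to ~30 halfway through.
# Compare two one-step-ahead forecasters:
#   recency:   predict the last observed value (simple heuristic)
#   full_mean: average everything seen so far (weights old data equally)

series = [random.gauss(10, 1) for _ in range(50)] + \
         [random.gauss(30, 1) for _ in range(50)]

def mae(forecast):
    """Mean absolute one-step-ahead forecast error over the series."""
    errors = [abs(forecast(series[:t]) - series[t])
              for t in range(1, len(series))]
    return sum(errors) / len(errors)

recency = lambda hist: hist[-1]
full_mean = lambda hist: sum(hist) / len(hist)

print(f"recency MAE:   {mae(recency):.2f}")
print(f"full-mean MAE: {mae(full_mean):.2f}")
```

The full-history mean keeps dragging the forecast toward the obsolete old level long after the shift; the recency heuristic adapts in one step. This is the sense in which a simple rule can be more robust in a changing environment.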
  • Fit the process:
    • Model what actually happens, rather than an “as-if” description.
      • Helps you to understand the causal process and make appropriate and understandable adjustments.
      • May yield simple heuristics that can be applied consciously or unconsciously.
  • Nudging and the lack of rationality.
    • Intuitions, biases work most of the time.
      • Strongly influenced by context, framing, stories, language.
      • [Something that is often overlooked in laboratory settings that aim to test heuristics.]
    • If people make the “wrong” decision, it’s not a heuristic “error”.
      • The heuristic was correctly applied.
    • Correct the heuristic, not the story:
      • Raise awareness, rather than changing the story / framing (paternalistic nudging).
  • The more you have to deal with situations of uncertainty, the more you need to simplify and make things robust, because you cannot know what the future will be.
    • [Perhaps overly simplistic.]
  • 7/10

Key Takeaways:

  • Uncertainty versus risk.
    • Risk can be quantified and eliminated.
    • Uncertainty is more difficult to quantify, but can potentially be reduced.
  • Prediction and trading off bias versus variance.
    • Simple models: predictions don’t vary much, but you may not discover the pattern in the data (under-fitting: low variance, high bias).
    • Complex models: predictions vary a lot, but you may capture a lot of noise (over-fitting: high variance, low bias). 
  • Decision making under uncertainty.
    • Reduce uncertainty by analysis to discover and develop set of ground rules.
    • Match complexity of model to (dynamics) of decision making environment.
    • Develop decision making tool that balances decision making requirements (speed, accuracy, ease-of-use, etc.)

Worth Listening:

This one covers a lot of ground and provides a lot of food for thought. The concept of fast-and-frugal trees is very interesting and the tool itself seems to capture the heart of the subject matter: in a high-uncertainty environment, through extensive quantitative analysis you can select the variables that help reduce uncertainty and develop a decision making tool that is simple and easy to use.

What this podcast downplays and perhaps overly simplifies is the need to (i) reduce uncertainty through analysis (developing an FFT is quite complex and involves extensive quantitative data analysis; it doesn’t seem to involve gut feelings or intuition), (ii) match the complexity of your prediction model to the decision making environment (less is not always more, as the podcast seemingly implies; you may be using too many or too few variables, under- or over-fitting, using the wrong (linear) model to measure (non-linear) complexity, or not sufficiently analyzing or understanding the process dynamics. If a model is wrong, find a better one: don’t throw in the towel and rely on your gut), and (iii) calibrate the features of your decision making tool against the decision making requirements (not all tools need to be simple per se; different situations require different trade-offs between accuracy, avoiding false positives or false negatives, ease of use, ease of change, speed, etc.).

Reduce uncertainty by analysis, not by gut feelings. If there is high uncertainty, reduce it by finding the right analysis to figure out whether there are any helpful ground rules that determine which variables drive the most accurate prediction. For instance, fast-and-frugal trees may seem simple, but you need to do a ton of math to identify the right variables to use in them. Suggesting that gut feelings are a better (or even good) option to rely on when faced with uncertainty is odd: by chance, your gut would have to pick the right variables out of the uncertainty soup. (As pointed out in one of the comments, gut feelings, i.e. learned unconscious competence, have very little to do with either efforts to improve models by reducing uncertainty or efforts to make models easier to use in daily life by developing heuristics.)

Match model complexity with the dynamics of the environment. Less is not always more. The use of language in this podcast is very loose, which can be confusing, especially when talking about the concept of information. “Less information” in this podcast means using fewer variables in a model when the model over-fits, i.e. when it interprets noise as signal. It doesn’t follow that you always improve model accuracy by simplifying when faced with uncertainty (you may make models easier to use in daily life, but that is a different objective – see the next point). It would make more sense to suggest calibrating the complexity of your model to the complexity of your decision environment, which often seems to involve understanding the process of change in a dynamic system.

Calibrate the decision making tool with the decision making requirements. Sometimes models need to be easy to use, sometimes you need to be able to change them fast, sometimes false positives are costly, and sometimes you need to avoid false negatives at all costs. There are always trade-offs to be made in developing heuristics. Some models may be better at prediction, but not easy to understand, use, or change. Fast-and-frugal trees are elegant and easy to use, but it is not a given that they outperform more complicated machine-generated decision trees. So calibrate and develop the heuristic that matches the requirements of the situation.

Heuristics can over time become gut feelings. This is highly speculative and perhaps not realistic in highly uncertain environments, but if you need to find a place for gut feelings in this discussion, perhaps it is this: some regularities can be uncovered even in certain complex situations. Once these regularities are translated into simple decision rules and turned into simple decision making tools, applying the heuristic may become second nature and allow you to develop the kind of unconscious competence associated with a reliance on gut feelings.

It may be helpful to point out the conflation of gut / intuition with heuristics / simple rules, as summarized by one of the commenters to the podcast. It highlights again that there really is no place for gut feelings in prediction under uncertain conditions and that heuristics are a way to make the decision making process easier (not necessarily more accurate) in uncertain situations, once all the hard work of quantitative analysis has already been done.

Intuition makes sense when there is:

  • Stable/linear environment
  • Repeated actions
  • Immediate feedback

In this environment, you have learned “unconscious competence”, which can be applied dependably and is triggered unconsciously.

Heuristics are practical ways to handle situations of high uncertainty:

  • Complexity (many interconnected variables)
  • Non-linearity (variables might interact in non-linear ways), and therefore
  • Uncertainty (the probability distribution of different outcomes is unknown)
