Robert Yeo

Baloney checklist






The baloney checklist and how to make better decisions


Robust decision-making frameworks, sources of feedback and diverse points of view are all essential to reduce the risk of making poor decisions.


A framework can be invaluable for vetting the quality of decisions, because it forces you to consider how a recommendation or decision has been made, not just the content of the proposals. The common problem is that many decision-makers focus exclusively on content when they review and challenge. I am sure I am not the only one to have suffered through pedantic questioning of the data, or even of the formatting and spelling within my work. Yet this, more often than not, misses the more significant issues, which is why a systematic review of the decision-making process, one aimed at mitigating the biases that may have influenced the recommendations and crucial decisions, is necessary.


The key to good decision-making is to tilt the odds in your favour before deciding.

Carl Sagan (1934-1996), the influential astronomer, astrophysicist and science communicator, offers evergreen advice with his "baloney detection kit". His tools to enhance critical thinking and combat lazy thinking are to be used whenever new ideas are offered for consideration. Below, I add comments to his nine tools to help you answer your questions and question your answers.


  1. Wherever possible, there must be independent confirmation of the "facts". Corroborate: trust, but verify.

  2. Encourage substantive debate on the evidence by knowledgeable people with different points of view. Complex things are rarely black and white or either/or.

  3. Arguments from "authority" should carry little weight. Don't trust so-called "experts". Check track records - don't be a journalist who reports "someone said something."

  4. Spin more than one hypothesis. Think of all the different ways to explain something and, like a scientist, carry out many small, low-risk experiments to test them. Real scientists don't simply trust the science; they are constantly experimenting.

  5. Avoid getting overly attached to a hypothesis because it's yours. Confirmation bias is real. Fight it - use a challenge (not support) group to improve your thinking and generate new ideas.

  6. Quantify. If whatever it is you're explaining has some numerical quantity attached to it, you'll be much better able to discriminate among competing hypotheses. Distil data and numbers into terms that your audience can relate to.

  7. If there is a chain of argument, every link in the chain must work - not just the premise. Seek out the weaknesses in your idea so that you can refine and improve it; don't hide them.

  8. Occam's Razor. When faced with two hypotheses that explain the data equally well, this heuristic urges us to choose the simpler one. Making things simple is challenging but better in the long term.

  9. Always ask whether the hypothesis can be falsified. We must be able to check assertions out. The larger the assertion, the more checking is necessary. Untestable propositions are not worth much.


In addition to teaching us what to do when making a decision or evaluating a claim, any good decision-enhancing framework must also teach us what not to do. This helps us to recognise the most common fallacies of logic and hopefully avoid several potentially perilous cognitive biases:


  • Attack the problem, not the person - keep the two separate. This is a solid foundation for any conflict resolution.

  • Do not accept arguments from authority where it is not possible to evaluate them on their merits. Sorry, this is not the time to be trusting.

  • Correlation does not imply causation. A positive or negative association between two variables does not necessarily mean that a change in one is causing the change in the other.

  • The absence of evidence is not evidence of absence.

  • Observational selection, i.e. counting the hits and ignoring the misses.

  • Survivorship bias - don't concentrate on the entities that passed a selection process while overlooking those that did not. The mutual fund industry has a habit of closing unsuccessful funds, so it only ever reports historical results from the funds that are still active; the short simulation below shows how this flatters the record.
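
To make the effect concrete, here is a minimal Python sketch. The fund count, the return distribution and the closure rule are all invented for illustration, not real industry figures; the point is only that averaging the survivors alone overstates the true average across all funds.

    import random

    # Hypothetical illustration of survivorship bias (invented numbers).
    random.seed(42)

    # Simulate one year of returns for 1,000 funds: mean 5%, st. dev. 10%.
    returns = [random.gauss(0.05, 0.10) for _ in range(1000)]

    # "Close" every fund that lost more than 5%; only survivors report.
    survivors = [r for r in returns if r > -0.05]

    true_average = sum(returns) / len(returns)
    reported_average = sum(survivors) / len(survivors)

    print(f"True average across all funds: {true_average:.1%}")
    print(f"Average reported by survivors: {reported_average:.1%}")

Running this shows the survivors' average sitting comfortably above the true average, purely because the misses were removed from the record before the averaging was done.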


Challenge: decision time


Select an important decision that you are in the middle of making or will be making soon. Ideally, choose one that has been presented to you for review, so that you are entirely independent of the people making the recommendation. Ask the following questions to aid your decision-making:

  1. Is there any reason to suspect errors driven by self-interest?

  2. Have the people making the recommendation fallen in love with it?

  3. Were there dissenting opinions within the recommending teams?

  4. Have credible alternatives been considered?

  5. If you had to make this decision again in a year, what information would you want, and can you get more of it now?

  6. Do you know where the numbers came from?

  7. Is the base case overly optimistic, or is the worst case bad enough?

  8. Conversely, is the recommendation excessively cautious?


Share


If you enjoyed reading this article, please share it with someone in your network who might appreciate it, like a friend, family member, or coworker.


Subscribe


If you liked this article, please subscribe below for more insights. No spam, ever! Just great, insightful content to help you answer your questions and question your answers.
