
The Powerful Yet Frail Human Mind: Reality, Cognitive Biases, and Collective Hallucinations

Every once in a while I read a book that really influences me. One such book is The Undoing Project by Michael Lewis, author of several other books that have been made into major Hollywood films, including The Blind Side and The Big Short.

The Undoing Project discusses the insightful and innovative work of psychologists Daniel Kahneman and Amos Tversky, which led to a Nobel Prize in Economics. These two gentlemen developed a theory of the mind and identified many cognitive biases that appear to be ingrained in the human species, biases that consistently fool even highly educated specialists who, one would think, should know better.

I recently came across a popular article that highlights some of these biases: Here are 24 cognitive biases that are warping your perception of reality. If this short article whets your appetite, then I highly recommend The Undoing Project. If you really want to take a deeper dive into the topic, then Daniel Kahneman's book Thinking, Fast and Slow is for you. I am in the middle of reading this lengthy yet fascinating book.

As an example, here is one of the biases called the "Belief Bias":

If a conclusion supports your existing beliefs, you'll rationalize anything that supports it. In other words, instead of willingly looking at new information, we are primed to defend our own ideas without actually questioning them.

Anyone paying attention to the news lately, whether related to America's beleaguered President Trump or to Great Britain's ponderous Brexit negotiations, will see this bias in the supporters, the opponents and, especially, the media. Kahneman and Tversky say that we all do these kinds of things all of the time.

Another popular article I recently came across is We're All Hallucinating All of the Time. It argues that when we agree about our mutual hallucinations, that agreement is what we call Reality.

As engineers, here is a bias we should be especially careful about – the "Curse of Knowledge":

Once you understand something you presume it to be obvious to everyone. Things make sense once they make sense, so it can be hard to remember why they didn't. We build complex networks of understanding and forget how intricate the path to our available knowledge really is.

When teaching someone something new, go slowly and explain like they're ten years old (without being patronizing). Repeat key points and facilitate active practice to embed knowledge.

I have two recent, personal examples of this. The first is a conversation I had at the 13th Pressure Surges conference last month with a well-known waterhammer expert who testifies in court cases about waterhammer. He told me how an attorney once challenged him to write a description of waterhammer "that a housewife could understand" (presumably, a potential jury member). He said he struggled over an entire weekend to do that.

The second is a training class called "Introduction to Fluids" that I created at AFT for all of our non-engineers. The class had seven modules, covering topics that included waterhammer, compressible gas flow, and slurries. I taught the class (for the second time), and it was a fun challenge to try to explain the complexities of what we do to our non-engineering staff.

During these sessions, our non-engineering staff told me I had a knack for explaining things in a simple way. I hope that means the Curse of Knowledge is something I have learned to overcome! 
