"Thinking, Fast and Slow" by Daniel Kahneman ISBN 978-0-374-27563-1
This book is a powerful panorama of a distinguished academic career spanning decades and disciplines. The author touches on psychology, economics, statistics, and biology. The principle underlying all these topics is the innate duality of the human mind, which the author calls System 1 and System 2.
System 1 is the instinctive jump to conclusions, and System 2 is the analytical, lazy part. Most of the time, it is the first system that provides answers to questions, forms opinions, and guides our decisions. It often supplies the brain with a story coherent enough (even if it is completely wrong) to fool System 2 into skipping validation. By stating questions in an unusual way (but without obfuscation), the researchers could, for example, fool statistics students, and even economics professors, into answering and behaving in ways that completely contradict statistical facts or economic theory.
The book is very long, but after a slow start, I could not put the Kindle down. The author demonstrates over, and over, and over again how the first response gets even basic understanding wrong! He also does a marvelous job tying our human nature to policy decisions and societal outcomes. Bit by bit he takes apart the idea that humans are rational economic agents, arguing that we might be reasonable, but we are definitely inconsistent in our actions and thinking.
Thinking fast and slow in software development
An interesting effect of System 1 pops up in software development and project planning. When faced with a difficult question, System 1 quickly substitutes a simpler question on the fly. For example, imagine a situation: you have a website. Build times are long, it is not tested very well, and you know there are bugs. The website has been around for a long while. Should it be rewritten? The instinctive answer is yes! But notice that this answer is provided by the fast thinking of System 1, and it is actually an answer to a different question: "is the website bad, in your opinion?" You have no information about the actual user experience, how much a rewrite would cost, etc. So in the absence of all this relevant information, the brain quickly jumps to answering a simpler question.
There are other effects of System 1: optimistic planning, remembering only the last experience, overweighting small risks, the sunk cost fallacy, individual balance judgements, ignoring regression to the mean, anchoring.
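Regression to the mean, in particular, is easy to see for yourself with a quick simulation (a sketch in Python, with made-up normally distributed "skill" and per-round "luck" values): the top performers of round one score closer to average in round two, even though nothing about them changed.

```python
import random

random.seed(42)

# Each person's score is a stable skill plus independent per-round noise.
# Top round-1 performers were partly skilled, partly lucky; the luck
# does not repeat, so their round-2 average drifts back toward the mean.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
round1 = [s + random.gauss(0, 1) for s in skill]
round2 = [s + random.gauss(0, 1) for s in skill]

# Select the top decile of round-1 performers.
cutoff = sorted(round1, reverse=True)[N // 10]
top = [i for i in range(N) if round1[i] > cutoff]

avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"top decile, round 1: {avg1:.2f}")   # far above the mean
print(f"same people, round 2: {avg2:.2f}")  # still good, but closer to 0
```

No coaching, no complacency, no "sophomore slump" is needed to explain the drop; it is pure statistics, which is exactly why System 1 so readily invents a causal story for it.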
Anchoring is the unconscious weight that completely random numbers and associations have on us when we are asked to estimate something or plan an action. For example, this book review is about 60 lines long. How many people do you think work at my company? Anchoring states that your guess will be strongly influenced by this unrelated number, 60. The same principle applies to words and associations. An interesting use for anchoring popped up in another book I am reading: when kicking off a project, one can anchor stakeholders' expectations using System 1 cues, as described by Nishant Kothary in "The Design of People", a chapter in "The Smashing Book 4".