Daniel Kahneman -- Thinking, fast and slow
============================================

The parts I found to be most valuable were those that warned of common
decision-making mistakes or fallacies. Many of these are the mistakes we make
when we use what Kahneman calls System 1 thinking where we should be using
System 2 thinking. Roughly this means that we use quick, intuitive, evaluative
judgments where we should use slow, calculating judgments that take into
account lists of positives and negatives, possible problems as well as
possible benefits, experiences in other similar situations and projects, and
so on.

Kahneman is a bit of a pessimist, or, depending on your point of view, a
realist. He is not the person you want present when you are trying to
convince a possible funder that the project will be finished quickly, that it
will cost very little, that it will pay for itself, that it will be completed
as described and without incident, etc. He is also a proud theory buster, or
at least he seems to pride himself on criticizing decisions and proposals
made by someone who is blinded by a theory, especially a cherished and
tightly held theory.

Here are some of the decision-making fallacies that Kahneman warns us about:

- The optimistic fallacy -- It's all too easy to become convinced that a
  proposed project can be finished quickly, that it will be finished below
  cost, and that the finished project will do everything that the user wants
  it to do. As a computer programmer, I can say: "Yes, I've seen that
  before."

- The sunk-cost fallacy -- Once you've put time, effort, money, or other
  resources into a project, it is much harder to back out or drop the
  project, even when it is failing. Dropping the project means admitting
  failure; it means acknowledging or accepting your losses; and it means
  giving up something in which you have serious ego involvement.

- The planning fallacy -- This is one where the decision makers consider only
  unrealistic, best-case scenarios and where the decision could be much
  improved by considering other similar situations and projects (i.e., by
  taking an outside view). A variety of other considerations should be made
  in an attempt to avoid the planning fallacy, for example, what might go
  wrong in the future, and how the world (the competition, the user's needs,
  etc.) might change in the future.

One piece of important advice from Kahneman: learn to use (short) checklists
and simple algorithms in your daily decisions. If nothing else, doing so will
encourage you to slow down, to think more methodically, and to use objective
criteria rather than intuitions and likes or dislikes. Try to set your
cherished views aside and to avoid groupthink by considering ideas from other
sources. Try to use a checklist or set of questions that reveals a small set
of objective attributes, then use a simple algorithm (often a straightforward
sum is good enough), and finally factor in considerations that are specific
to the current project or decision.
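To make that concrete, here is a minimal Python sketch of the
checklist-plus-sum idea. It is my illustration, not Kahneman's: the checklist
items, the 1-to-5 rating scale, the score_project helper, and the accept
threshold are all made-up examples for a software-project decision::

    # Sketch of a short checklist scored with a simple algorithm (a plain
    # sum).  All items, ratings, and the threshold below are invented
    # examples.

    CHECKLIST = [
        "requirements are clearly understood",
        "similar past projects finished on time",
        "team has experience with the technology",
        "budget includes a margin for surprises",
        "external dependencies are few and stable",
        "success criteria are measurable",
    ]

    def score_project(ratings):
        """Sum the per-item ratings (each 1..5); a straightforward sum is
        often good enough."""
        if len(ratings) != len(CHECKLIST):
            raise ValueError("rate every checklist item")
        if any(not 1 <= r <= 5 for r in ratings):
            raise ValueError("ratings must be between 1 and 5")
        return sum(ratings)

    if __name__ == "__main__":
        # Rate each item deliberately (System 2 work), compare the total
        # against a threshold chosen in advance, and only then factor in
        # anything specific to this particular project.
        ratings = [4, 2, 3, 1, 3, 4]
        total = score_project(ratings)
        print("score:", total, "out of", 5 * len(CHECKLIST))
        print("proceed" if total >= 20 else "reconsider")

The point is not the arithmetic; it is that a fixed list of objective
questions and a mechanical rule slow you down and keep any single cherished
impression from dominating the decision.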
Over and over Kahneman emphasizes *his* opinion that *expert* opinion and
expert judgment are nowhere near as good as we often believe they are. We
all, experts and non-experts alike, think that our judgment is better than it
really is.

Kahneman also has a reasonably good discussion about predictability: under
what conditions the future *is* predictable, and under what conditions
"experts" can learn enough to be able to make valid predictions. A rough
summary is that (1) the future must not be random; (2) there must be cues and
indications from which valid predictions can be made; and (3) the expert must
be given the opportunity to learn those cues and what follows from them.

03/06/2013

.. vim:ft=rst:fo+=a: