A recent Financial Times column explored the way subtle biases can affect a person’s judgment. Two brothers, Chip and Dan Heath, both professors, describe a framework for counteracting these biases with the acronym “WRAP.”
“W” = Widen our options; avoid viewing decisions as simple “either/or” matters.
“R” = Reality-test our assumptions; identify our assumptions and examine their empirical support.
“A” = Attain distance before deciding; limit the potential for emotion to cloud the decision by taking time to step back and think it over.
“P” = Prepare to be wrong; recognize human limitations and avoid becoming emotionally invested in a decision that time has proven to be wrong.
The article offers some fascinating examples of how a decision that appears to be the simple product of rational thought can instead be the result of human bias. For example, people are more inclined to follow advice they pay for than free advice, regardless of the quality of the advice. And a hotel guest is more likely to re-use a towel when told that other guests re-use their towels than when told that re-using towels is environmentally friendly.
These and other examples of irrational decision-making can help people detect their biases, even if completely eliminating such biases is impossible. People who detect their biases are in a better position to counteract them, and better decision-making can in turn benefit their business or employer. For example, a manager who recognizes that people irrationally value paid advice over free advice may become appropriately skeptical of outside consultants’ opinions. Similarly, a manager who accepts that even the best business people make mistakes is more likely to recognize when a project has failed and cut the losses, rather than push on out of emotional attachment to sunk costs.