In the previous post, I used Galton’s ox-weight estimation and IMDb’s polling system to illustrate how the wisdom of the crowd works. Some readers may disagree that the two examples are comparable, since the ox-weight estimation is an objective judgment while movie reviews are subjective judgments. While the nature of the estimation is indeed quite different, the mechanism that moderates extreme values or opinions works in a similar fashion in both cases. In this post, I shall explain why it is detrimental to use advice selectively, and how a variety of advice can support better decision making.
Many self-help books will tell you not to take advice from just anyone, and to be selective about whom you listen to. When faced with the difficult task of corroborating advice, it is understandable that an intuitive heuristic helps to make sense of all the information more easily. Unfortunately, such a heuristic is prone to errors like egocentrism, confirmation bias and anchoring.
Egocentrism in advice-taking occurs when people value their own opinions more than the advice they receive. This happens even when people are given the advice before they encounter the decision task, and even when the decision task is unfamiliar to them (Krueger, 2003). Overvaluing one’s own opinions results in what is known as “egocentric advice discounting”: people may integrate the advice of others, but only by shifting slightly away from their own opinions (Yaniv & Kleinberger, 2000).
Intriguingly, Harvey & Harries (2004) observed this phenomenon even when judgments that did not belong to participants were falsely labelled as theirs. Believing that they had made those judgments themselves, participants valued them over the other advice they received. In other words, people do not always use advice appropriately, as they can be swayed by subconscious egocentrism.
On a similar note, confirmation bias may be another reason why the intuitive use of advice is unreliable. Wason’s (1960) discovery of the confirmation bias suggests that people generally seek information that confirms their hypotheses rather than disconfirms them. Klayman & Ha (1987), however, preferred the term positive hypothesis testing, as they believed that confirmation bias refers to the attachment to a default hypothesis regardless of the evidence.
In any case, both confirmation bias and positive hypothesis testing may be at play when it comes to advice-taking. With an initial opinion in mind, it is tempting to put more weight on advice that confirms that opinion than on advice that disconfirms it. This inevitably leads to a biased use of advice that is self-fulfilling and non-comprehensive.
Both egocentrism and confirmation bias assume that people receive advice with some pre-existing knowledge or opinion. Harvey & Fischer (1997) posited that this pre-existing opinion may act as a form of anchor, which is then adjusted according to the advice subsequently received. Anchoring is the tendency to rely on the first piece of information one is exposed to (Tversky & Kahneman, 1974), which in this case is the pre-existing opinion.
However, Harvey & Harries (2004) distinguished this anchoring and adjustment from egocentrism: they believed that egocentrism is a relatively stable trait, while the effect of anchoring is mostly short-term. This makes particular sense for people with no pre-existing knowledge or opinion, who remain susceptible to anchoring on the first piece of advice they receive even if they are not egocentric. Evidently, it is dangerous to use advice selectively when one does not know the situation well.
* * * * * * * * * *
While people may think that relying on the most knowledgeable advisor is a safe approach, the fact of the matter is that even such advisors are biased in their own ways, and selectively using advice may result in a judgment that is extreme and ill-informed. This was illustrated in the previous post, where Rotten Tomatoes critic reviews often do not match a film’s true reception. If a variety of advice is received instead, the advice taker becomes aware that a spectrum of opinions exists, and is less likely to fixate on any single extreme opinion.
In fact, because of all the nuances contained in subjective information, it may be even more important for subjective judgments to harness the wisdom of the crowd than for objective judgments. If we accept that subjective judgments, like objective ones, contain random and systematic errors, then integrating different independent opinions will likely reduce those errors, and the truth should lie somewhere in the middle.
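The error-averaging intuition is easy to check with a small simulation. The sketch below is illustrative only: the true value, the number of judges, and the assumption of purely random, zero-mean errors are all hypothetical choices, not data from any real poll.

```python
import random

random.seed(42)

TRUE_VALUE = 100.0  # the hypothetical quantity being judged
N_JUDGES = 500      # number of independent opinions

# Each judge's estimate is the truth plus independent random error.
estimates = [TRUE_VALUE + random.gauss(0, 15) for _ in range(N_JUDGES)]

# Error of a typical individual vs. error of the crowd's average.
crowd_average = sum(estimates) / len(estimates)
avg_individual_error = sum(abs(e - TRUE_VALUE) for e in estimates) / len(estimates)
crowd_error = abs(crowd_average - TRUE_VALUE)

print(f"average individual error: {avg_individual_error:.2f}")
print(f"crowd-average error:      {crowd_error:.2f}")
```

Under these assumptions the crowd's average lands far closer to the truth than a typical individual, because the independent random errors largely cancel out. Systematic errors shared by all judges, by contrast, would not cancel, which is one reason diversity among advisors matters as much as their number.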
It may just be wishful thinking, but I hope that this approach to advice-taking can also reduce some of the polarisation plaguing our world today.
- Krueger, J. L. (2003). Return of the ego—self-referent information as a filter for social prediction: comment on Karniol (2003). Psychological Review, 110, 585–590.
- Yaniv, I., & Kleinberger, E. (2000). Advice taking in decision making: Egocentric discounting and reputation formation. Organizational Behavior and Human Decision Processes, 83(2), 260–281.
- Harvey, N., & Harries, C. (2004). Effects of judges’ forecasting on their later combination of forecasts for the same outcomes. International Journal of Forecasting, 20(3), 391–409.
- Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.
- Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211–228.
- Harvey, N., & Fischer, I. (1997). Taking advice: Accepting help, improving judgment, and sharing responsibility. Organizational Behavior and Human Decision Processes, 70(2), 117–133.
- Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: heuristics and biases. Science, 185, 1124–1131.