Pelham’s Law of Cognitive Error: “I am most likely wrong about many things.”
Cognitive science has solidly identified certain traits of human behavior by which people habitually operate at a sub-optimal level. We make frequent cognitive errors, some of which have the counterproductive tendency of concealing the error itself, making it even harder for us to correct ourselves. Hence, Pelham’s Law of Cognitive Error.
It is a strategic safeguard of awareness, designed to avoid cognitive error by keeping one’s own tendency toward error ever in view. But even deeper than that, it’s simply the truth. I am most likely wrong about a great many things! My track record suggests as much, and even though I have corrected many of my beliefs so far, there are undoubtedly more yet to be corrected.
Here are some of the more common cognitive errors and traits against which I believe well-informed humans should be ever vigilant.
- Myside Bias–The tendency to favor one’s own position unfairly over alternative positions of equal or greater merit.
Example: A person believes that his favorite party’s candidate won a debate against the opponent even though the opponent made 50% fewer errors of fact and proved his position 73% more often.
- Status Quo Bias–The tendency to assume without good cause that leaving things the way they are is superior to making the changes that any possible alternative would require.
Example: The committee is simply uninterested in hearing recommendations on how to improve the current process for registering new members.
- Attribute Substitution–Choosing an easier-but-dysfunctional answer over a harder-but-functional answer precisely because it is easier.
Example: Many opine that the United States has strayed significantly from its Constitution and the principles upon which it was founded. The difficult-but-functional solution, therefore, would be to reform the practice of government back to its originally intended role. The easier-but-dysfunctional solution, however, is simply to elect a president from whichever major party is not currently in power, even though neither the candidate in question nor his political party purports to have any interest whatsoever in wholesale reform.
- Cognitive Miserliness–The tendency to resist “spending” cognitive energy, particularly with regard to identifying the best solutions.
Example: Billy makes a routine delivery run for work each week and has done so for two years. A more efficient route would save half the time and mileage, but Billy has never bothered to question or analyze his route, and so has never discovered the shorter one.
- Ability Bias–The tendency to overestimate one’s own knowledge, skills, and abilities.
Example: A typical instance of such bias is the “planning fallacy”, by which a person routinely underestimates the amount of time he or she will need to complete a task.
- Memory Bias–The tendency to over-rely on the accuracy of one’s memory. Memory is malleable, meaning that it can change over time; studies demonstrate that memories may be modified somewhat each time they are recalled.
- Expert Bias–The tendency to rely on the assertions of a person regarded as an “expert” simply because the person is so regarded. A problem with such bias becomes obvious when competing experts in the same field disagree as to fact. Just being called an expert, therefore, does not make one inerrant in his assertions.
Example: A doctor prescribes medicine that will later be recalled as being unsafe, yet because the doctor is considered to be an expert, the patient takes the medicine with complete confidence until it is recalled.
- Hearsay Bias–The tendency to take as fact (and even to repeat to others) what one has “always heard”, even though one has never investigated the matter for himself or herself.
Example: “War is good for the economy.” Many people repeat this even though it is demonstrably false with regard to the vast majority of people. (Only a few benefit economically from war.)
- Orthodoxy Bias–The tendency to believe without just cause that an established doctrine is a sound doctrine, simply because it is established.
Example: “The time-honored tradition of the teen ministry is fundamental to our Christian practice.”
These biases and tendencies are well documented, and I find that I operate more efficiently as a human since I learned about them—and particularly since I learned to recognize them at work in my own mind. The 12-step recovery cliché comes to mind as being pertinent here: “The first step in recovery is admitting that you have a problem.” Similarly, this old adage comes to mind as well: “Forewarned is forearmed.”
Since I became well aware of these tendencies, my life has become, more or less, a general exercise in reassessing what I think I know and abandoning my flawed beliefs as I discover them. The results have been interesting. I find that I become increasingly confident—not in the arbitrary and arrogant way of those who simply decide to believe they are right, but in the way of the man who is confident that his house is secure because he just checked for himself to be sure all the doors and windows are locked.
I find Pelham’s Law to be philosophically interesting because it is an embodiment of having a “sober estimate” of oneself. This idea has roots in several religions and yet, ironically, tends to be generally elusive in those very religions, with few managing to make a regular practice of it. For instance, a great number of Christians routinely overestimate their knowledge of the Bible and are frequently shown to be in error on some point of fact about which they are confident. Similarly, a great many Americans consider themselves staunch “constitutionalists”, yet they have never invested the 45 minutes required to read the Constitution, and they frequently get their facts about it wrong.
I do not want to be like that any longer, and the best way I have found so far to guard against it is to keep ever in mind the likely possibility that I am making these errors right now!
For a related article, see The Super Witness.