Introduction: Critical Thinking and Statistics/Probability
RiteshPratap A. Singh
The field of critical thinking deals with the objective analysis of facts to form a judgement. A major aspect of critical thinking is the identification of the various cognitive biases and logical fallacies that skew our judgement.
For example, our emotions are well known to affect our judgements, so it is important to be rational while analysing data. While critical thinking is required in every aspect of life, it is especially important in statistical inference.
A cognitive bias is an erroneous pattern of evaluation, reasoning, remembering, or other cognitive processing, often arising from holding on to one’s preconceived notions, emotions, preferences, and beliefs regardless of contrary information. A number of cognitive biases are known to affect scientific inferences.
Confirmation bias is the tendency to interpret, recall, search for, or favour new pieces of evidence or information in a way that confirms one’s preexisting beliefs or hypotheses.
For example, consider the tree of life in taxonomy. Archaebacteriologist Carl Woese elevated the Archaebacteria to the top rank of ‘domain’, while protistologist Cavalier-Smith did not agree; he instead elevated his pet taxa, the protists, to two kingdoms.
Another example: if you are a vegetarian, chances are high that you prefer reading articles or books that confirm the advantages of vegetarianism over non-vegetarianism.
If you are an Israeli, you may trust only Israeli media and distrust Palestinian media, even though objective reality is independent of your preferences (or even your consciousness). It is a basic human tendency to stay in one’s shell; ‘thinking outside the box’ is not intuitive.
A related bias is ‘cherry picking’, in which we selectively pick the pieces of evidence or data that confirm our pre-existing beliefs or expectations.
For example, during drug testing, a chemist might deliberately choose subjects in whom the drug worked and conceal subjects in whom it did not, in order to inflate the statistical significance.
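A toy simulation (in Python, with entirely made-up numbers) shows how this inflates an apparent effect: the simulated ‘drug’ below does nothing at all, yet keeping only the ‘responders’ manufactures a strong positive result.

```python
import random
import statistics

random.seed(42)

# 100 subjects take a drug with NO real effect: each "response"
# is pure noise centred on zero.
responses = [random.gauss(0, 1) for _ in range(100)]

honest_mean = statistics.mean(responses)            # hovers near 0
cherry_picked = [r for r in responses if r > 0]     # keep only "responders"
picked_mean = statistics.mean(cherry_picked)        # strongly positive

print(f"honest mean effect:        {honest_mean:+.2f}")
print(f"cherry-picked mean effect: {picked_mean:+.2f}")
```

Dropping the non-responders turns a null effect into an apparently large one; no amount of statistical testing downstream can undo this distortion at the data-collection stage.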
Another cognitive bias is ‘positive results bias’: authors are more likely to submit, and journals are more likely to accept, positive results rather than negative ones. While reviewing the literature, it is important to remember that a prevalence of positive results in print does not reflect the real prevalence of positive results.
Other noteworthy cognitive biases include anthropocentrism, optimism bias, hyperbolic discounting, status quo bias, negativity bias, and so on.
A number of logical fallacies are also known to bias our judgements and scientific inferences. For example, consider the following argument by a Chinese scientist: “Chinese traditional medicine is effective because all Chinese people believe in it, and it has been part of Chinese tradition for thousands of years”.
This argument suffers from two logical fallacies. The first is called argumentum ad populum, the appeal to widespread belief. A belief being widespread does not necessarily make it valid or cogent. Another widespread belief, that you catch a cold (the illness) from cold weather, is likewise a fallacious argument.
The second fallacy in the earlier argument is argumentum ad antiquitatem, the appeal to tradition. A custom being part of tradition does not automatically make it valid. For example, consider chewing paan (betel nut) and spitting it out in public places; it is very much part of tradition, yet does that say anything about its social or hygienic appropriateness? A related logical fallacy is dicto simpliciter (sweeping generalization).
For example, ‘logging (cutting down trees) is anti-environmental’ is a popular generalization that has no scientific basis; logging is indeed necessary for the healthy management of planted forests.
Another logical fallacy often employed by pseudoscience quacks is argumentum ad verecundiam, the appeal to authority. Consider the following proposition: “Do you know the world-famous American scientist Dr Deepak Chopra? He has confirmed the efficacy of traditional and alternative medicine through quantum physics”, etcetera.
Similarly, just because the twenty-three-time Olympic gold medallist Michael Phelps tried the Chinese pseudoscientific practice called ‘cupping’ does not validate it as science.
Another form of the same trick is pointing to a book as the authority. Just because something has been printed in a good-looking book (for example, the ‘bible’ of homoeopaths, the Materia Medica, often comes gold-bound, with a gilded spine and Victorian-looking fonts) does not make the claim a fact. In science, there is no authority.
Science is open to criticism and improvement; anyone can falsify anyone else’s claims through rigorous, evidence-based methodology. Thomas Henry Huxley once said: “The deepest sin against the human mind is to believe things without evidence.”
Investigators should also guard against everyday propaganda by understanding a basic psychological trap that liars create to cheat the gullible, called the ‘illusion of truth.’
The tactic is famously associated with Nazi propagandist Joseph Goebbels: “repeat a lie often enough, and it becomes the truth”. In philosophy, this is the logical fallacy of argumentum ad nauseam, the appeal to repetition.
People often fall for small lies interspersed among truths here and there, which accumulate into one monstrous lie: propaganda. Another strategy makes use of a trap known as the ‘philosophical burden of proof’ (famously illustrated by ‘Russell’s Teapot’).
British philosopher Bertrand Russell once famously wrote about one such claim, which he invented for the purpose: that a lone teapot orbits the Sun between Earth and Mars.
Can anyone disprove this assertion? Proponents of pseudoscience make similarly unverifiable, unfalsifiable claims and then ask sceptics to prove otherwise, which is impossible.
As we will see later, hypothesis testing in statistical inference is designed to avoid this problem by initially assuming a ‘null’ hypothesis of no effect; in philosophy, this style of reasoning is called ‘argument by contradiction.’
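This logic can be previewed with a minimal permutation test in Python (the measurements are invented for illustration): we provisionally assume the null hypothesis that treatment has no effect, under which the group labels are arbitrary, and then ask how often an arbitrary relabelling produces a difference as large as the one observed.

```python
import itertools
import statistics

# Hypothetical measurements, invented for illustration.
treated = [5.1, 5.8, 6.2, 5.9, 6.5]
control = [4.2, 4.8, 4.5, 5.0, 4.4]
observed = statistics.mean(treated) - statistics.mean(control)

# Null hypothesis: no effect, so any 5-vs-5 split of the pooled data
# is as plausible as the real one.
pooled = treated + control
count = total = 0
for combo in itertools.combinations(range(10), 5):
    group_a = [pooled[i] for i in combo]
    group_b = [pooled[i] for i in range(10) if i not in combo]
    if statistics.mean(group_a) - statistics.mean(group_b) >= observed:
        count += 1
    total += 1

p_value = count / total   # tiny p => the "no effect" assumption looks untenable
print(f"observed difference: {observed:.2f}, p = {p_value:.4f}")
```

Here only 1 of the 252 possible relabellings matches the observed difference, so the ‘no effect’ assumption leads to a near-contradiction and is rejected.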
Ignoratio elenchi, missing the point, is yet another logical fallacy known to affect our judgements. Purveyors of pseudoscientific formulations often claim that their remedies are ‘government certified’. On close inspection, you will find that these government certifications are what are known as GMPs (Good Manufacturing Practices).
If a commercial item is manufactured in hygienic surroundings with proper quality controls, the government issues a GMP certification. What is missed here, however, is that this certificate has nothing to do with the efficacy of the remedy; the certification is not an attestation that the drug ‘works’ (i.e., is ‘evidence-based’).
Even if the government markets such products, no one can take for granted that these remedies are effective.
Other logical fallacies include argumentum ad hominem, the personal attack. For example: “the person who makes the claim is a Pakistani, so the claim must be false!”. The false dichotomy is yet another well-known fallacy: only two options are presented, while there might be many intermediate or other options available.
For example, species concepts in biology arbitrarily classify individuals into one species or another, ignoring transient forms (those still undergoing the process of speciation) as well as hybrids.
In linguistics, two neighbouring languages are in fact separated by fine gradations of dialects and pidgins, so arbitrarily grouping these variants into one of the two languages is simplistic and convenient, but far from the truth.
The slippery slope is a consequentialist logical fallacy (it pertains to the consequences of an action) in which the proponent argues that a small initial action would set off a chain of events culminating in a significant negative effect.
For example, teaching evolution in school would make children believe that they are one among other animals. This would cause them to reconsider morality. Finally, they would become immoral!
There are multiple subconscious psychological effects that influence our decisions. Our mind often takes shortcuts for problem-solving, the so-called heuristics.
While evaluating a specific topic or concept, our mind relies on the immediate examples that come to mind: the ‘availability heuristic.’
For example, when asked which of the following is more likely to cause death (watersports, terrorism, or aircraft crashes), most people are likely to choose the second or third option, because aircraft crashes and terrorism are far more likely to grab media headlines than death by drowning. In reality, deaths by drowning far outnumber the other two combined.
Yet another subtle subconscious effect is the Hawthorne effect, in which subjects modify their behaviour simply in response to the fact that they are being observed, not in response to any particular experimental manipulation.
For example, workers at a Western Electric factory were subjected to an experiment assessing changes in productivity in response to increased light intensity inside the factory.
Productivity was observed to increase significantly. When the light intensity was reduced, productivity again increased. In reality, productivity had nothing to do with light intensity; it changed simply because the subjects knew they were under observation.
Yet another bias is the ‘bandwagon effect’, in which people do things because other people in their social circle do them. In the ‘bystander effect’, individuals are less likely to offer help to a victim when other people are present.
For example, tens of people may simply stare at an accident victim, each thinking someone else will help or call the ambulance (diffusion of responsibility), and in the end no one does anything. In ‘anchoring’, the first thing we judge influences our judgement of all that follows.
For example, setting a very high initial price is a common strategy among sellers during negotiations, so that they can bring the price down marginally and still enjoy large profits. In ‘survivorship bias’, investigators consider survivors alone, disregarding non-survivors.
Consider a bizarre 1987 article that concluded that cats falling from higher storeys survive the fall better. As the study was based on a veterinary registry, it included only those cats that survived the fall and were taken to clinics (and obviously not those that died from the fall).
Among the cats that survived the fall and were injured, falls from higher floors contributed more patients (survivors). In the case of social desirability bias, respondents are likely to answer questions in a way that conforms to established social norms.
For example, in questionnaire surveys asking ‘how often do you masturbate?’, responses are far more likely to be underreported, as the activity carries a perceived social stigma.
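Returning to the falling-cats study: a short Python simulation (with assumed, purely illustrative fatality rates) shows how a survivors-only registry can make high falls look deceptively safe.

```python
import random

random.seed(7)

# Assumed toy model: the chance that a fall is fatal rises with the floor.
def fatal(floor):
    return random.random() < 0.08 * floor

floors = [random.randint(1, 10) for _ in range(10_000)]   # 10,000 falls
registry = [f for f in floors if not fatal(f)]            # vets see survivors only

high_in_registry = [f for f in registry if f >= 7]
high_falls = [f for f in floors if f >= 7]
true_survival = len(high_in_registry) / len(high_falls)

print(f"true survival rate for falls from floor 7+: {true_survival:.0%}")
print("survival rate within the registry: 100% (survivors only, by construction)")
```

In this toy model only about a third of high-fall cats survive, yet a study sampling the registry would see nothing but survivors, including plenty from the highest floors.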
Yet another cognitive effect is the ‘placebo effect’, which results from the belief that one has been treated, rather than from any actual physical, physiological, or chemical activity of a treatment.
For example, a doctor may dispense an unlabelled tablet from his cabinet, informing the patient that it is some arcane new-generation therapeutic, while in reality the tablet is nothing but a multivitamin or something similar, devoid of active moieties.
Patients, especially hypochondriac personalities, might in fact show significant improvement in their symptoms if the symptoms have a psychological component.
Alternative medicines exploit this placebo effect. To control for psychological involvement, a placebo is often used as a negative control in drug testing. If a placebo is used and the experiment is designed to appear exactly like the drug-candidate treatment, it is called a placebo treatment.
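Why the placebo arm matters can be sketched in a Python simulation with assumed effect sizes: both groups improve through belief alone, so only the comparison against placebo isolates the drug’s genuine contribution.

```python
import random
import statistics

random.seed(1)

PLACEBO_EFFECT = 2.0   # assumed improvement from belief alone
DRUG_EFFECT = 1.0      # assumed extra improvement from the drug itself

def improvement(extra):
    # Each patient's improvement: placebo response + drug response + noise.
    return random.gauss(PLACEBO_EFFECT + extra, 1.0)

drug_group = [improvement(DRUG_EFFECT) for _ in range(200)]
placebo_group = [improvement(0.0) for _ in range(200)]

vs_nothing = statistics.mean(drug_group)                                   # ~3.0
vs_placebo = statistics.mean(drug_group) - statistics.mean(placebo_group)  # ~1.0

print(f"apparent effect vs no treatment: {vs_nothing:.2f}")
print(f"true drug effect vs placebo:     {vs_placebo:.2f}")
```

Comparing the treated group against untreated baseline would credit the drug with the full three units of improvement, two of which are pure placebo response.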
Two informal logical fallacies are worth special mention here. In the straw man fallacy, instead of attacking or refuting the main argument, the proponent constructs an unrelated (often superficially similar) argument and refutes that instead.
For example, anti-vaccination camps say: “You don’t care about autistic children. As you don’t care about autistic children, vaccines are bad”.
Here, the argument is no longer about the efficacy of vaccines; instead, it rests on the implicit false assumption that vaccines cause autism and attacks from that premise.
The Texas sharpshooter fallacy got its name from a cowboy who fired shots randomly at his barn in Texas. Looking at the cluster of bullet holes, he picked the area with the maximum density of holes and painted a bull’s eye around it, to claim that he was a sharpshooter.
This is related to ‘hindsight bias’ in which there is an inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.
The Texas sharpshooter bias is common in statistics when differences in the data are ignored but similarities are stressed, leading to the problem of the clustering illusion.
The issue is also related to P-hacking, in which an investigator adjusts the analysis after seeing the data (for example, changing the significance level after obtaining the P value) so as to make the result statistically significant.
Statistician James L. Mills famously wrote: “If the fishing expedition catches a boot, the fishermen should throw it back, and not claim that they were fishing for boots.”
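The fishing expedition can be simulated in Python (illustrative numbers only): every ‘study’ below measures pure noise, yet a researcher who tests twenty hypotheses per paper and reports any hit will ‘catch’ something in most papers.

```python
import random
import statistics

random.seed(0)

def fake_study():
    """One 'study' of pure noise: 30 samples where the true effect is zero."""
    data = [random.gauss(0, 1) for _ in range(30)]
    z = statistics.mean(data) / (statistics.stdev(data) / 30 ** 0.5)
    return abs(z) > 1.96   # "significant" at roughly the 5% level

# 1,000 papers, each fishing through 20 unrelated hypotheses.
papers_with_a_hit = sum(
    any(fake_study() for _ in range(20)) for _ in range(1000)
)
print(f"papers reporting a 'significant' result: {papers_with_a_hit / 1000:.0%}")
```

With a 5% false-positive rate per test, fishing through twenty tests yields at least one spurious ‘discovery’ in roughly two-thirds of papers, which is why hypotheses must be fixed before the data are examined.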
In addition to psychological biases, there are several statistical fallacies that are discussed in later modules. These include statistical confounding, gambler’s fallacy, prosecutor’s fallacy, defence attorney’s fallacy, hot-hand fallacy, clustering illusion and so on.
A prudent student of statistics is expected to be aware of the multitude of subconscious biases that might influence any step of the scientific method, including experimental design, data collection, data analysis, and interpretation of results.
- Some important biases pertinent to the field of statistics are confirmation bias, cherry picking, positive results bias, argumentum ad populum, argumentum ad antiquitatem, dicto simpliciter, false dichotomy, argumentum ad verecundiam, illusion of truth, ignoratio elenchi, the availability heuristic, social desirability bias, the bystander effect, and the Hawthorne effect.
- It is essential that students of statistics be aware of these subconscious biases to make sure that their conclusions and judgements are not affected by them.