Critical Thinking Is Hard.
Critical thinking is "the objective analysis of facts to form a judgement." To do that well requires three broad sets of skills:
1) Skeptical Clarification
2) Logical Thinking
3) Humble Self-Reflection
These are not skills that come easily to humans. Good communicators must anticipate failures of critical thinking among their audiences.
The human mind did not evolve for optimal critical thinking. It evolved to make fast decisions. Institutions, like science, have helped society think more critically, but this kind of thinking generally does not come naturally to individuals. Decades of research in cognitive psychology and other fields have shown dozens of systematic ways in which people think and act with less than perfect rationality.
Help people get the full picture
Help people get the full picture on nutrition topics and issues by clarifying misconceptions and raising awareness of the scientific evidence.
The Fact Checker (Column published by the Washington Post): "The purpose of this website, and an accompanying column in the Sunday print edition of The Washington Post, is to “truth squad” the statements of political figures regarding issues of great importance, be they national, international or local."
FactCheck.org (Website managed by the Annenberg Public Policy Center of the University of Pennsylvania): "A nonpartisan, nonprofit “consumer advocate” for voters that aims to reduce the level of deception and confusion in U.S. politics."
Help people prioritize the truth
Help people prioritize the accuracy of information by reminding them to read skeptically, because not all information is accurate and not all sources are reliable.
Calling Bullshit (Online course and blog by Carl Bergstrom and Jevin West): "The aim of this course is to help students navigate the bullshit-rich modern environment by identifying bullshit, seeing through it, and combating it with effective analysis and argument."
"Don't Want to Fall for Fake News? Don't Be Lazy" (Wired article by Robbie Gonzalez citing David Rand): People are often lazy when they read social media. Psychologist David Rand argues that we can help people be more cognitively careful, saying "I think social media makes it particularly hard, because a lot of the features of social media are designed to encourage non-rational thinking."
Help people understand evidence quality
Help people understand the difference between high-quality scientific evidence and low-quality scientific evidence.
For example, people often mistake correlation (two things that tend to vary together) for strong evidence of causation (one thing actually causing the other).
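The point above can be illustrated with a short simulation (a minimal sketch; the variable names and scenario are hypothetical): two quantities can be strongly correlated simply because both depend on a hidden third factor, even though neither causes the other.

```python
# Illustrative sketch: correlation without causation via a hidden confounder.
import random

random.seed(0)

# Hidden confounder, e.g., a person's overall "health-consciousness"
# (a made-up variable for illustration).
confounder = [random.gauss(0, 1) for _ in range(1000)]

# Both observed variables track the confounder plus independent noise.
x = [c + random.gauss(0, 0.5) for c in confounder]  # e.g., vitamin intake
y = [c + random.gauss(0, 0.5) for c in confounder]  # e.g., exercise frequency

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((ai - mean_a) * (bi - mean_b) for ai, bi in zip(a, b))
    var_a = sum((ai - mean_a) ** 2 for ai in a)
    var_b = sum((bi - mean_b) ** 2 for bi in b)
    return cov / (var_a * var_b) ** 0.5

r = pearson(x, y)
# The correlation is strong, yet x does not cause y and y does not cause x:
# both are driven by the confounder.
print(f"correlation between x and y: {r:.2f}")
```

High-quality evidence of causation typically requires more than an observed association, such as a randomized controlled experiment that removes the influence of confounders.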
Show Me the Evidence (Vox column by Julia Belluz): "Go beyond the headlines for a deeper look at the state of science research around the most pressing health questions of the day."
Help people understand tradeoffs
Communicators should try to make it easier for people to weigh the magnitude of some risks in their lives against others. Without an absolute sense of risk magnitude, people will find it difficult to understand the tradeoffs that their choices entail.
Against Empathy: The Case for Rational Compassion (Book by Paul Bloom): "Basing his argument on groundbreaking scientific findings, Bloom makes the case that some of the worst decisions made by individuals and nations—who to give money to, when to go to war, how to respond to climate change, and who to imprison—are too often motivated by honest, yet misplaced, emotions. With precision and wit, he demonstrates how empathy distorts our judgment in every aspect of our lives, from philanthropy and charity to the justice system; from medical care and education to parenting and marriage. Without empathy, Bloom insists, our decisions would be clearer, fairer, and—yes—ultimately more moral."
"Evidence for Absolute Moral Opposition to Genetically Modified Food in the United States" (article by Sydney Scott, Yoel Inbar, Paul Rozin): "In a survey of U.S. residents representative of the population on gender, age, and income, 64% opposed GM, and 71% of GM opponents (45% of the entire sample) were “absolutely” opposed—that is, they agreed that GM should be prohibited no matter the risks and benefits. “Absolutist” opponents were more disgust sensitive in general and more disgusted by the consumption of genetically modified food than were non-absolutist opponents or supporters."
Gaps in Knowledge
People overestimate how well they understand the details of a process.
The Knowledge Illusion (book by Steven Sloman and Philip Fernbach): "The human mind is both brilliant and pathetic. We have mastered fire, created democratic institutions, stood on the moon, and sequenced our genome. And yet each of us is error prone, sometimes irrational, and often ignorant. The fundamentally communal nature of intelligence and knowledge explains why we often assume we know more than we really do, why political opinions and false beliefs are so hard to change, and why individual-oriented approaches to education and management frequently fail. But our collaborative minds also enable us to do amazing things. The Knowledge Illusion contends that true genius can be found in the ways we create intelligence using the community around us."
"Education" is not Enough (Blog published on Thinking and Eating): Dr. Jason Riis blogs about a recent article discussing the use of conversion messages in influencing GMO acceptance.
The scientific method is explicit about the uncertainty that we should have. However, few individuals, even scientists, consistently succeed in acknowledging all of the uncertainty in daily life or in nutrition science.
Superforecasting: The Art and Science of Prediction (Book by Philip Tetlock and Dan Gardner): "In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are superforecasters."
"Do People Inherently Dislike Uncertain Advice?" (Academic paper authored by Celia Gaertig and Joseph Simmons): Gaertig and Simmons find that while research participants generally prefer confident advisors over unsure advisors, this effect does not extend to the advice itself. Research participants actually preferred advisors who gave uncertain advice rather than certain advice, suggesting that advisors try confidently expressing uncertain advice.
Other Helpful Resources...
Rationally Speaking (Podcast by Julia Galef): "Rationally Speaking is the bi-weekly podcast of New York City Skeptics. Host Julia Galef explores the borderlands between reason and nonsense, likely and unlikely, and science and pseudoscience. Any topic is fair game as long as reason can be brought to bear upon it, with both a skeptical eye and a good dose of humor!"
Sound Bites (Podcast and Blog by Melissa Dobbins): "People are hungrier than ever for realistic options that empower them to make healthier choices, while bringing back the enjoyment of food. Through her podcast interviews, social media outreach, and TV segments, Melissa helps people digest food and nutrition information so they can make their own, well-informed decisions based on facts, not fear. As a certified diabetes educator for 20 years and a former supermarket dietitian, Melissa shares evidence-based information and realistic solutions to help people enjoy their food with health in mind."
Thinking, Fast and Slow (Book by Daniel Kahneman): "In the international bestseller, Thinking, Fast and Slow, Daniel Kahneman, the renowned psychologist and winner of the Nobel Prize in Economics, takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. The impact of overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the profound effect of cognitive biases on everything from playing the stock market to planning our next vacation―each of these can be understood only by knowing how the two systems shape our judgments and decisions."
Thinking Critically about Nutrition (Today's Dietitian article by Jason Riis, Brandon McFadden, and Karen Collins): "A survey of Today’s Dietitian readers found that failures in critical thinking about nutrition are prevalent among clients. Here’s how behavioral science can help RDs turn the tide so patients can make the best dietary choices to improve their health."