Truth Decay In Education

Tuesday, February 13, 2018

The RAND Corporation’s provocative policy brief on “truth decay”—defined as the blurring of the “line between fact and fiction in American public life”—identifies four major sources of this degradation: changes in how we get information, including the rise of social media and the 24-hour news cycle; cognitive biases such as the human tendency to “seek out information that confirms preexisting beliefs and reject information that challenges those beliefs”; the general polarization of contemporary politics and society; and “competing demands on the educational system.”

As a habitué of the education policy world, I was struck by that last point. RAND president Michael Rich and political scientist Jennifer Kavanagh, the authors of the report, suggest that demands and constraints on K-12 schooling have “reduced the emphasis on civic education, media literacy, and critical thinking.” They add: “Without proper training, many students do not learn how to identify disinformation and misleading information, and are susceptible to disseminating it themselves.”

Such training is indeed vital. Kids will believe almost anything unless and until adults help them learn how to distinguish fact from fiction. They can see for themselves that the sky is (usually) blue, but they won’t know which hand is right and which is left without explicit instruction; they won’t grasp “trial by jury” until it’s taught to them; and they surely won’t intuit the blurry but crucial line between freedom of speech and shouting “fire” in a crowded theater.

That truth matters should be obvious, and any obscuring of the boundaries between truth and falsehood, fact and fiction, news and “fake news” should alarm us all. “Where basic facts and well-supported analyses of these facts were once generally accepted,” the RAND duo soberly declares, “disagreement about even objective facts and well-supported analyses has swelled in recent years.”

It’s hard to disagree with that—and impossible not to lament the change. I well recall the much-quoted aphorism of my mentor, the late Daniel P. Moynihan, that “Everyone is entitled to his own opinion, but not to his own facts.”

When there’s no agreement on the facts, we’re left only with opinion—spin, if you will—and that opinion often masquerades as information. That’s what President Trump terms “fake news,” of course, and he’s not entirely wrong—nor is he entirely innocent of perpetrating it: The Washington Post’s “fact checker” reported last month that it had tabulated 2,140 “false or misleading claims” made by Trump during his first year in office.

As would-be suppliers of news fill their pages and our screens with opinion and cater more and more to their own echo chambers of subscribers and viewers, it becomes ever harder to get the “straight story.”

When it becomes difficult to know what’s real and what’s fantasy, what’s information and what’s opinion, what’s scientific and what’s unproven (and perhaps unprovable), people become both cynical and gullible. Kids worry about a mythical “endangered tree octopus.” Grown-ups get nervous about UFOs. Hoaxes become credible—as Orson Welles famously demonstrated when he terrified the nation with his “War of the Worlds” radio broadcast in 1938. This unmooring of credibility is then compounded by actual errors that yield frightening misinformation, as happened recently in Hawaii when an employee of the state’s Emergency Management Agency sent out a false alarm that a ballistic missile was going to strike the islands. What can one actually believe? Who can one trust?

Yet just as it’s wrong to place all responsibility for blurring fact with fiction on the President and the media, it’s also wrong to blame schools for not doing enough to combat such blurring because of inattention to “critical thinking.” (I concede that they don’t pay nearly enough attention to civics.) A curricular and pedagogical emphasis on critical thinking has been all the rage for years now among K-12 educators, but how that’s been construed and applied—rather than its absence—is what contributes to truth decay.

Critical thinking in the schools can go awry in at least two ways. One is when it replaces knowledge. Actual information. Facts. How often have you heard education savants and practitioners say something like: “In the age of the Internet, we don’t need to supply kids with information. That they can always look up. What we must focus on are their analytic skills, especially their critical thinking.”

The problem there is that once “thinking” gets detached from “knowledge,” the sky becomes the limit as to what one might think and whether it has any foundation in reality. Three decades ago, Diane Ravitch and I wrote in What Do Our 17-Year-Olds Know? that “the power of the facts-versus-concepts dichotomy has grown so great within the social studies field that some professionals now harbor an instinctive distrust of facts per se.” So instead of teaching “about maps and chaps,” as old-fashioned British educators described geography and history, teachers are more inclined to explore themes such as “mankind’s interactions with the environment” and “why the powerful oppress the weak”—and to assign students essays and exam questions with prompts like “describe ways that you may yourself have contributed to global warming, how you feel about it, and what you could do differently to prevent it.” 

Please. That sort of activity calls for plenty of critical thinking but no actual knowledge. Consider, instead, a practice question for the recently revised Advanced Placement exam in U.S. history, which begins by quoting George Washington’s “Farewell Address”:

“[H]istory and experience prove that foreign influence is one of the most baneful foes of republican government. . . . Excessive partiality for one foreign nation and excessive dislike of another cause those whom they actuate to see danger only on one side and serve to veil and even second the arts of influence on the other. . . . The great rule of conduct for us, in regard to foreign nations, is in extending our commercial relations to have with them as little political connection as possible. So far as we have already formed engagements, let them be fulfilled with perfect good faith. Here let us stop. Europe has a set of primary interests which to us have none, or a very remote relation. Hence she must be engaged in frequent controversies, the causes of which are essentially foreign to our concerns.”


The ideas expressed in Washington’s address most strongly influenced which United States foreign policy decision in the twentieth century?

(A) The establishment of the United Nations in 1945

(B) The formation of the NATO alliance between the United States and Western Europe in 1949

(C) The refusal to join the League of Nations in 1919

(D) The oil embargo against Japan in 1941

This question requires critical thinking, too, but it’s based on knowledge—about Washington and about twentieth-century history. It also requires the ability to extract information from a key document.

If educators don’t teach students to acquire, possess, and value knowledge, as well as the ability to analyze and apply it, there’s no way they can teach them to distinguish truth from error. Truth clings to facts like barnacles to a rock.

What’s more, as veteran education thinker and curriculum designer E.D. Hirsch has been writing for years, the vitality and viability of our democracy itself “depends on shared knowledge.” What holds us together as a society and polity are the things we understand in common, a shared body of knowledge, without which it’s impossible to have shared values, principles, and practices, much less informed citizenship. As Hirsch writes: “A lack of knowledge, both civic and general, is the most significant deficit in most American students’ education. For the most part, our students (and teachers) are bright, idealistic, well meaning, and good natured. Many students and teachers are working harder in school than their counterparts did a decade ago. Yet most students still lack basic information that high school and college teachers once took for granted.”

Beyond that, as cognitive scientist Daniel Willingham explains, it’s impossible to teach “critical thinking” in the abstract. One must possess information to think about—and the way one thinks about information in one field (history, say) is almost entirely different from how one thinks about it in another field (chemistry, perhaps). In other words, critical thinking isn’t an abstract, transferable skill. It’s what psychologists call “domain specific,” which means it’s intimately tied to knowledge.

The second way that our notions of critical thinking go off track was bequeathed to primary-secondary schooling (and many other realms of our society) by postmodernism, which over the past several decades has infected higher education—the place where our educators and education thinkers learned what they are now putting into practice with their K-12 pupils.

Postmodernism, according to a helpful definition provided by PBS, is: “[A] general and wide-ranging term which is applied to literature, art, philosophy, architecture, fiction, and cultural and literary criticism, among others. Postmodernism is largely a reaction to the assumed certainty of scientific, or objective, efforts to explain reality. In essence, it stems from a recognition that reality is not simply mirrored in human understanding of it, but rather, is constructed as the mind tries to understand its own particular and personal reality. . . . In the postmodern understanding, interpretation is everything; reality only comes into being through our interpretations of what the world means to us individually. . . .”

“Interpretation is everything.” Is that not also a definition of “truth decay”? And is it not amplified in our schools, especially as contemporary liberalism pushes students to embrace what historians call “presentism”—this despite the valiant efforts of the Common Core State Standards, the Advanced Placement program, and others to push students to seek actual evidence in original texts rather than just saying what they think about something? When we adopt a presentist perspective, we judge what happened then by today’s norms rather than striving to understand why something happened the way it happened when it did, and we end up guided by our opinions of the past rather than a clear-eyed understanding of the past. Consider, for example, our curricular squeamishness about acknowledging that Christopher Columbus “discovered” America. To be sure, those who were already here didn’t see it that way, but from the perspective of fifteenth-century Europe, it was indeed a newly discovered place.

Presentism merges with political correctness to distort the truth both in K-12 schools and on university campuses, as we see in the recent spate of tearing down statues and renaming buildings and mascots. For example, two schools near Portland, Oregon, are losing the name Lynch, never mind that an actual family by that name donated land a century ago so that schools might be created in places that didn’t have any. What does such a move tell the children who attend those schools about history? About the fact that some people are named Lynch? About the value of charity versus contemporary offense-avoidance?

I’m not suggesting that educators knowingly fill their pupils’ minds with false facts. Rather, I’m suggesting that they’re not supplying nearly enough actual facts—fundamental knowledge—and that this vacuum, matched with an overemphasis on “thinking skills” and refracted through postmodernism’s focus on interpretation, is contributing as much as anything to the truth decay that should gravely worry us all.

We know from decades of National Assessment data that young Americans emerge from school with frighteningly little knowledge of the country’s actual past and of its civic foundations. Hirsch is spot on about that. And there are valiant efforts in some places to combat that problem, as in the feisty Knowledge Matters Campaign and the College Board’s courageous overhaul of its AP courses in history and civics. Though it is growing, however, the knowledge army still consists of just a few platoons while the critical thinkers field many divisions.

I’m not sure what can be done at the macro level, beyond recognizing the challenge and applauding those willing to tackle it. State academic standards, properly framed and applied, would help. A more immediate and vivid, if partial, solution is for concerned parents to choose schools that buck the truth-decay trend—and for policy-makers to enable more such schools to come into existence, through mechanisms like chartering and vouchers, so that more families can make such choices. There are great schools—such as those affiliated with Hirsch’s own Core Knowledge Foundation, and others that use its curriculum—that strive to impart both essential information and critical thinking to their pupils. Creating more such places of learning won’t eradicate the problem of truth decay—too many other forces are advancing it—but it would at least offer a refuge for those who want to shield their children from it, and for educators who know better.