What’s Wrong With Experts?

America may need expertise these days more than ever. So why are we so intent on disregarding the wisdom and advice of authorities smarter than the rest of us?

Author: David M. Shribman

When homeowners contemplate purchasing a refrigerator, they consult expert opinions online or in consumer guides. When patients are given a frightening diagnosis, they search for a specialist with expertise in the disease. When researchers prepare a scholarly article, they submit their work to expert review. And, in Canada, when outdoors enthusiasts want to know what snowshoes to purchase — backcountry or recreational, or even bear paw or beaver tail — they head to the retail chain called Sports Experts.

Despite such routine advice-seeking, expertise is under assault today more than ever. Medical experts extol the indispensability of vaccinations, yet are dismissed as tools of Big Pharma. Environmental experts plead for action to battle climate change, and their opponents say they are captives of ideological zealots. Academic experts are chided for their ivory towers and denigrated for inhabiting a parallel universe where peripheral concerns are made central.

No longer is mastery of the commercial, political and cultural arts widely worshipped, and no longer are the study, ambition, pluck and virtue that create expertise universally valued. Instead, expertise has become an umbrella term in the common mind for a hopelessly ossified establishment struggling to maintain a false superiority over a larger mass of people who possess superior judgment, wisdom and common sense. As a result, “meritocracy” — the subject of a searing critique in a new book, The Meritocracy Trap, written, poignantly, by a Yale professor — has suddenly become a term of opprobrium.

Ironically, these changes come at a time when experts have more learning, more experience and more exposure than ever before. They also come at a time when offensives against experts — and the conspiracy theories this hostility can sometimes spawn — can be transmitted more widely, more quickly and more effectively than ever before. “There’s an ongoing attack on expertise,” Governor Jerry Brown of California told me just before he left office last year. “We’re resorting to superstition, to a more primitive outlook, instead of relying on expertise.”

And with that attack has come an assault on many of the core values of institutions in the academic firmament, all of which celebrate the acquisition of knowledge, salute the achievement of intellectual distinction and honor the cultivation of excellence — three of the raw materials of expertise. “We are losing sight of the value of knowledge — and we are losing respect for expertise,” says Barbara K. Mistick, president of the National Association of Independent Colleges and Universities. “We’ve decided as a culture that experts are villains. We can only hope that this is temporary. Our civilization depends on it.”


Expertise is built by diligence and discipline. It grows slowly, sometimes bending to the light of mentors, sometimes sprouting in areas not yet cultivated. Sometimes it is built on solid foundations of knowledge, sometimes it emerges from the rubble of antiquated or discredited thought. It is a torch that can light the way forward or make clear the path from a distant past. It defies sentimentality. It becomes visible in quiet corners of life but has its greatest impact in the bustle of the university, the political arena, the courtroom.

Even though expertise is the means by which we separate fact and truth from emotion and myth, emotion and myth are sometimes far more powerful and enduring than these rivals. The imbalance between these forces, intensified during the digitally dominated 2010s, is one of the cultural markers of our time. And a culture that has cultivated disbelief, embraced the iconoclast and democratized mass communication has become a natural breeding ground for anti-expert sentiment.

While expertise is indispensable, respect for it is not inevitable. Nor is expertise recognized by all, or embraced swiftly. As a result, many Americans no longer trust the experts and their expertise. “In the old days, when politicians or business people were in trouble, they’d call in the ‘white coats,’ experts who would come to the rescue,” Toronto Star columnist Chantal Hébert told me. “Our leaders today make a point of saying they’re against the white coats. We have made expertise evidence of ignorance.”


In the course of May 2017 — seven months after Americans went to the polls to engage in the sober quadrennial ritual when the nation contemplates its character and determines its destiny — the two onetime presidential-election combatants, each oblivious to the other’s remarks, delivered twin commencement addresses 600 miles apart. They presented vastly different perspectives on the question of expertise. Here is President Donald J. Trump at Liberty University in Lynchburg, Virginia: “A small group of failed voices who think they know everything and understand everyone want to tell everybody else how to live and what to do and how to think.” Here is former Secretary of State Hillary Rodham Clinton at Wellesley College in Massachusetts: “You are graduating at a time when there is a full-fledged assault on truth and reason. People [are] denying science.”

Today almost all experts — scientists who study the environment and its complex interactions — believe climate change is real, imminent and catastrophic. The 2018 National Climate Assessment argued that global warming “creates new risks and exacerbates existing vulnerabilities in communities across the United States, presenting growing challenges to human health and safety, quality of life, and the rate of economic growth.” One often-cited study found that 97 percent of actively publishing climate scientists believe global warming is caused by human activities, prompting NASA to conclude, “Most of the leading science organizations around the world have issued public statements expressing this, including international and U.S. science academies, the United Nations Intergovernmental Panel on Climate Change and a whole host of reputable scientific bodies around the world.”

Illustrations by Robert Neubecker

President Trump thinks otherwise. “I don’t believe it,” he said of the 2018 government report produced by more than 300 experts, overseen by 60 members of a federal advisory committee and reviewed by the National Academy of Sciences. This is not a new position for the president. In a tweet he issued in 2012, the Manhattan billionaire made his views clear: “The concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive.”

In a politicized age, climate change has become yet another element of the great American divide. But those who feel climate change is an urgent issue requiring a combination of scientific expertise and political will are troubled that the expert views supporting their position are dismissed with cavalier ease. The climate assessment reflects the consensus of the government the president leads, says Democratic Senator Edward Markey of Massachusetts, who as a congressman headed the House global-warming select committee from 2007 to 2011. “But the president says: ‘Ignore it. They’re just trying to take your jobs away.’ You can’t hire more expertise than all those folks who sat in the front row in high school, knew all the answers and went to Caltech and MIT. And Trump says they don’t know what they’re talking about.”

Despite their failure to convince climate-change deniers, the experts continue to lean on their intellectual or moral authority in their interchanges with political figures. James Wynn of Carnegie Mellon University examined 67 congressional hearings on climate change between 1985 and 2013 and found repeated appeals to expertise, such as “I have been researching climate change for 40 years” or “I have published over 200 papers on glacial melting” or “I am a recipient of the Nobel Prize in Chemistry.” In that period, the number of appeals to expertise in oral testimony increased by 100 percent, and in prepared statements by 167 percent, a surge Wynn suggests is the result of “increased political pressure and skepticism about climate change.”


Experts are mostly right; otherwise they would not be celebrated for their expertise. But sometimes experts are spectacularly, gloriously, even hilariously, wrong. The expert who stumbles so stands out, much as the airliner that doesn’t land safely — after tens of thousands of flights do — attracts attention. That disparity has been part of the human condition for centuries.

Experts have erred at least since Aristotle, who, citing barnacles forming on the hull of a boat, wrongly theorized that life could be spontaneously generated. In modern times, we might look to the mathematician and astronomer Percival Lowell, that pioneering founder of the Lowell Observatory in Arizona, who theorized that the “canals” on the surface of Mars were an irrigation system laid out by a presumably intelligent, or at least diligent, life-form on the fourth planet.

Experts’ mistakes have had grave human consequences. A particularly disastrous seven-year period, bookended by the nuclear disasters at Three Mile Island in 1979 and Chernobyl in 1986, and punctuated by the 1984 Bhopal industrial gas leak in India, prompted fresh skepticism of science. In each case the assurances of experts — those who said that the nuclear-energy plants were sound, and those who said the cleanup at the site in India could be swift and effective — proved wrong. Indeed, at Bhopal, the environmental and health threats produced by the methyl isocyanate gas that escaped from the Union Carbide pesticide plant continue today.

In the very year of the Bhopal disaster, in which nearly 4,000 people died and more than half a million more were exposed to the toxins, the psychologist and political scientist Philip E. Tetlock was startled by how often, during the chill of the Cold War, the views of Soviet and American scientists diverged. So he began a two-decade project to evaluate just how credible these experts were, eventually examining more than 80,000 predictions by acknowledged experts in a wide variety of fields. “There is often a curiously inverse relationship between how well forecasters thought they were doing and how well they did,” he reported in his 2005 book, Expert Political Judgment: How Good Is It? How Can We Know? In short, the experts did horribly. Fully one quarter of their predictions turned out to be wrong.

Yet abandoning expert authority seems misguided. Identifying a dangerous strain in American life, Tom Nichols, a professor at the U.S. Naval War College and author of The Death of Expertise, warns that people are growing more confident in their views and yet are less competent to evaluate the culture in which they live, rendering them sure that their perspectives are more valid than those of experts. As a result, he worries about a culture where professionals become peripheral and cultivated knowledge and experience are systematically devalued.

“I fear,” he wrote in Foreign Affairs in 2017, “we are moving beyond a natural skepticism regarding expert claims to the death of the ideal of expertise itself: a Google-filled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, teachers and students, knowers and wonderers — in other words, between those with achievement in an area and those with none.”

The conservative commentator Peggy Noonan, a onetime Ronald Reagan and George H.W. Bush speechwriter who herself spoke to Notre Dame’s Class of 2019, argues that the combination of the wars in Afghanistan and Iraq and the 2008 economic crash “cratered” the Republican “reputation for economic probity,” and, in her Wall Street Journal column, has broadened her critique beyond Republicans: “Americans have long sort of accepted a kind of deal regarding leadership by various elites and establishments. The agreement was that if the elites more or less play by the rules, protect the integrity of the system, and care about the people, they can have their mansions. But when you begin to perceive that the great and mighty are not necessarily on your side, when they show no particular sense of responsibility to their fellow citizens, all bets are off. The compact is broken.”

Indeed, the compact — we might think of it as a social contract — is now broken.

“The entire apparatus of the federal government,” writes the columnist Barton Swaim, “is run by people with impressive academic degrees and extensive specialized experience — by people who, in 2008, brought the nation to the brink of economic disaster and, over the last half-century, put the government nearly $20 trillion in debt; by people who were allotted billions for the eradication of poverty but failed to do much of anything beyond the creation of a few expensive and inextinguishable government agencies; by people who claimed to know how to lower the cost of medical care but managed to raise it dramatically.”


The word “elite” was first used in its current meaning by the Italian economist Vilfredo Pareto in 1902 and was weaponized half a century later in The Power Elite, an influential book by the Columbia University sociologist C. Wright Mills. This term, too, is now under attack.

There was a time when the word was celebrated — and, in sports and the arts, ours still is. Today, even those Americans who are the most skeptical of expertise still want their children to play with an elite soccer or baseball traveling team, or to play music in elite orchestral ensembles, or to enroll in an elite educational institution. But that’s it. Otherwise we flee from the word. The Yale political historian Beverly Gage argues that many of us now approach expertise “with suspicion, seeing it as mere cover for elite interests, bureaucratic self-preservation or partisan agendas.” Tom Nichols explains: “To reject the advice of experts is to assert autonomy, a way for Americans to demonstrate their independence from nefarious elites.”

The elites of today, especially on campus, seem especially vulnerable. While parents compete to get their children into elite colleges, sometimes paying with jail terms along with hundreds of thousands of dollars, the institutions they regard with such approbation are under increasing scrutiny — and increasing skepticism.

Confidence in higher education dropped an astonishing 9 percentage points between 2015 and 2018, according to a Gallup survey of institutional confidence. The reasons, of course, are complex, but surely one is the cost of higher education. The connection between a liberal-arts education and immediate job prospects has always been tenuous, but in the past, perhaps because tuition costs were lower, Americans had more tolerance for the old chestnut that education — any education — was valued simply for education’s sake. No longer. Professors and graduate students have ample and legitimate rationales for emphasizing, to cite one example, the primacy of gender in contemporary academic study. But many conservatives and even some liberals view such academic pursuits as preoccupations with the peripheral — almost always leading cocktail-hour conversations or cable-television debates to employ the words “privilege” and “elitism.”

But the conflation of privilege (a status many experts enjoyed in their upbringing), elitism (a status they achieve via their educations, achievements and positions) and their inevitable handmaiden, expertise, has a dangerous effect. It diminishes each at a time when scientific expertise has reached new heights, and just as elitism is being blamed for the wealth gap and a cultural gap that has left huge swaths of the American population feeling ignored and sneered at, dismissed and disenfranchised.

At the heart of this conundrum is the notion that experts are authority figures; they come by their status as experts by being authorities in their fields. But that word — “authority” — also has become toxic in an era of disruption and in an environment where, as the historian Doris Kearns Goodwin told me, “the distrust of authority rubs off on expertise.”

President Trump, who confronted all experts in 2016 by arguing, “The experts are terrible,” also confounded the experts with his 2016 electoral triumph. The irony of the times may be that a man who by any reasonable measure is a charter member of America’s elite — born into wealth, with an undergraduate degree garlanded in ivy from the University of Pennsylvania and possessed of his own jetliner, his own golf club in Scotland, his own Manhattan tower, and, now, his own military and nuclear weapons — is the nation’s principal warrior against elitism and privilege . . . and expertise.

Trump is not a lone presidential warrior against expert opinion. In the first decade of the last century, the government’s chief agricultural chemist concluded that saccharin was a health threat. Theodore Roosevelt objected strongly, telling the scientist, “Anybody who says saccharin is injurious to health is an idiot.” Roosevelt was right. But the current president’s contempt for experts is wide-ranging. He dismisses domestic-agency bureaucratic experts, whom he considers part of the “deep state”; scientific experts, who he believes are oblivious to how the real world works; and intelligence experts, whom he considers disloyal. He and his first secretary of state moved to pare down the number of experts in the diplomatic service, and Robert C. O’Brien, his latest national security adviser, plans to trim his staff to fewer than 120 policy experts, down from 174, by early 2020.

All this comes despite the fact that an American president today has at hand the best expertise in the world, some of it in residence in Washington, some of it scattered around the country but ready to be consulted at a moment’s notice. But while his predecessors may have dismissed this expert or that expert, the Trump White House stands alone in not being a magnet for the nation’s experts. It is inconceivable that the president would consider playing host to the kind of dinner John F. Kennedy held in 1962, when he convened all the living Nobel Prize winners in the Western Hemisphere and asked an Exeter-educated Harvard professor, Arthur M. Schlesinger Jr., to write his remarks. That speech began: “I want to tell you how welcome you are to the White House.”

Trump stands virtually alone among international leaders as a climate-change skeptic; his reluctance is part of the new climate he brought to Washington, where expertise across the capital is degraded, including expert advice on foreign policy. Even so, expertise in national-security matters remains highly prized — by experts. Just this spring Daniel W. Drezner, professor of international politics at Tufts University’s Fletcher School of Law and Diplomacy, argued in Foreign Affairs that “the barriers to entry for harebrained foreign policy schemes have fallen away as Americans’ trust in experts has eroded.” The impact: America’s allies once comfortably relied on American experts to rein in dangerous ideas, but that fail-safe mechanism no longer exists.


Politics has been a peculiarly dangerous terrain for experts for nearly three-quarters of a century, in part because experts’ political acumen is not always expert.

In 1948, President Harry Truman learned that a Newsweek poll of 50 political correspondents found that these so-called experts believed devoutly that New York Governor Thomas E. Dewey would win the White House. “I know every one of these 50 fellows,” the president told his adviser Clark M. Clifford, who would later accede to expert advice as secretary of defense in the Vietnam years. “There isn’t one of them has enough sense to pound sand in a rat hole.” (Trump might have said much the same thing in the months before the election, when his campaign was declared to be over before the voters had their say.)

Indeed, the decisions of Presidents John F. Kennedy (who followed expert advice to undertake the doomed Bay of Pigs invasion), Lyndon B. Johnson (who followed expert advice to escalate American troop presence in South Vietnam) and George W. Bush (who followed expert advice on Saddam Hussein’s possession of weapons of mass destruction into war in Iraq) are vivid examples of expertise gone awry with tragic consequences. The miscalculations in Southeast Asia led David Halberstam in 1972 to write The Best and the Brightest.

It was no coincidence that Steve Bannon, Trump’s controversial former adviser, required new White House personnel to read the Halberstam book, much the way Kennedy once ordered his national-security team to read Barbara Tuchman’s The Guns of August, about the opening days of World War I. Five years before the outbreak of the Great War, the British journalist and legislator Norman Angell, a member of the Council of the Royal Institute of International Affairs, wrote The Great Illusion, with the very appealing (and ultimately very wrong) thesis that the costs of a European war would be so great that no country would dare engage in it — and that if for some reason military conflict did ensue, it would be brief. Despite the expertise of the author of this theory, and the broad public appeal of his message, the war lasted more than four years and caused as many as 20 million deaths and at least that many injuries. In that war, as in all wars, the first casualty was truth. The second was expertise.


Several months ago I traveled to the far reaches of New Hampshire, to the impoverished town of Berlin, where the presidential candidate Elizabeth Warren, a Democratic senator from Massachusetts, was holding an informal session in the city hall. Standing in the council chambers, she went on a bit about the current political atmosphere and then stopped abruptly. “I want to say something that is deeply controversial in Washington but is safe to say here in northern New Hampshire, because we are so far away from Washington: I believe in science.”

That was a remarkable statement, more for the need to make it than for the content of it, because in a way Washington has been a global pathfinder in the cultivation of scientific expertise. The transformation that the Department of Agriculture undertook in the 1880s, when its emphasis moved from distributing seeds to distributing expertise, is a model of government cultivation of expertise. But just across the National Mall from the USDA’s Beaux Arts headquarters sits the Smithsonian National Air and Space Museum, where curator Jennifer Levasseur of the department of space history told me that one of the most persistent questions museum officials receive is whether Americans actually stepped on the moon in 1969. “The evidence we provide is not always persuasive,” she said, “so we have to argue that the moon landing didn’t happen on a sound stage.”

The merging of expertise, elitism, conspiracy theories — and the cult of common sense created by years of folklore celebrating the pioneer experience — has produced a sense of American cultural independence that sometimes takes a peculiar form in the freedom to believe that which is palpably false. “We’re a frontier country, and many people believe common sense is enough,” says Kathryn Sullivan, who was an astronaut on three Space Shuttle missions and the first American woman to undertake a spacewalk. “But when you take that and add it to the culture of celebrity and the power of television, you have a dangerous situation.”

That danger, multiplied by social media — the word-of-mouth of the contemporary period — extends even into public health, where over the decades new medical tests and remarkable advancements have made life safer and longer. And yet perhaps the most passionate debate involving medical experts today revolves around the question of vaccination. Last spring New York City declared a public health emergency to address an outbreak of a disease that had been officially eliminated from the U.S. 19 years earlier. In the first half of 2019, 1,157 cases of that ailment, measles, were confirmed in 28 states, the largest number in 27 years.

Qualms about vaccinations have been a part of medical doctrine and debate for decades. A third of a century after Massachusetts made smallpox vaccination mandatory for public schoolchildren, a pamphlet decrying vaccination circulated in Boston. That despite the fact that as early as 1776, General George Washington had required his troops to be inoculated against smallpox.

Federal health data indicates that the percentage of American children under the age of 2 who haven’t been vaccinated has quadrupled in less than two decades — at a time when the World Health Organization (WHO) estimates put the number of lives saved by vaccines around the globe each year at more than 2 million. Even so, the anti-vaccination movement now is stronger than ever, prompting Lamar Alexander, a Tennessee Republican who is chair of the Senate health committee, to open Capitol Hill hearings last March by saying that vaccines “meet the FDA’s gold standard of safety” and “save lives.”

The crisis is so grave that the WHO has pressured Facebook and Instagram to commit to providing only reliable, expert-driven information online about vaccinations. Even so, internet groups continue to use Facebook to crowdsource for anti-vaccination efforts, and a Centers for Disease Control and Prevention study in October found that the percentage of kindergartners who had not been vaccinated had increased nationwide over the previous two school years. “Vaccine misinformation,” said Dr. Tedros Adhanom Ghebreyesus, director-general of the WHO, “is a major threat to global health that could reverse decades of progress made in tackling preventable diseases.”

Heidi J. Larson, a professor of anthropology, risk and decision science at the London School of Hygiene and Tropical Medicine, worries that the next major outbreak of disease will come not because of a lack of prevention techniques. “Instead, emotional contagion, digitally enabled, could erode trust in vaccines so much as to render them moot,” she said. “The deluge of conflicting information, misinformation and manipulated information on social media should be recognized as a global public-health threat.”

One of the reasons for the anti-vaccination fever may be the proliferation of provably false information on social media, where falsehoods spread faster and more widely than does fact-based information. So concludes a study undertaken by a Massachusetts Institute of Technology team that examined the circulation of 126,000 news items among 3 million Twitter users. A separate study, conducted by the Royal Society for Public Health, found that half of British parents of children younger than 5 had encountered “negative messages” about vaccination on social media. To promote a counteroffensive against misinformation, Larson now directs the Vaccine Confidence Project, with the mission of detecting rumors and early signals of scares about vaccines.

A University of Pittsburgh study found that while personal and religious liberty was a factor in the views of vaccination opponents, so, too, was suspicion of the medical community. “This is a classic example of distrust of experts,” said Brian Primack, who was a senior author of the study before being named dean of the University of Arkansas’ College of Education and Health Professions. “It’s a case where we have information systems in the world that make people who are not experts sound as if they are experts.”


In this age, even academics — the men and women who themselves are experts — are questioning their relationship to, and dependence on, expertise, especially when it comes to projections and predictions. “When we are trying to understand and plan for the future, we need to rely on expert judgment to help us do projections,” said Meagan S. Mauter, an associate professor of civil and environmental engineering at Stanford University who studies climate and energy topics.

One of the tools many researchers employ in their own work is expert elicitation, the process of asking experts for their best guesses about a social or scientific problem. An enormous amount of work has been done in recent years to understand how successful these experts are in estimating, for example, when Arctic sea ice is going to melt. In some of these areas, experts may be no better than the rest of us, and the wisdom of the crowd might be as good, or better.

But experts perform some functions with appreciably greater skill, precisely in those areas where they possess information, intuition and judgment that can help researchers reach credible conclusions. “Done well, expert elicitation can make a valuable contribution to informed decision making,” M. Granger Morgan, a professor of engineering and public policy at Carnegie Mellon University, wrote in the May 2014 edition of the Proceedings of the National Academy of Sciences. “Done poorly it can lead to useless or even misleading results that lead decision makers astray, alienate experts, and wrongly discredit the entire approach.”

In the end, expertise — like love, like religion, like friendships and business relationships, like politics — comes down to trust, even as we acknowledge that expertise sometimes only provides the best guess, and all guesses are gambles. Without trust, no amount of accumulated knowledge, no amount of sober wisdom, no quantity of peer-reviewed scholarship or op-eds or books means anything.

It is not a coincidence — indeed it is a great truth — that trust is the building block of democracy and of a civil society. The legislative and executive branches of our republic, to select an important example, depend on trust — the sacred trust that 435 members of the House, 100 members of the Senate, and the president can represent the public interest. When they don’t — and today many Americans believe they don’t do that fundamental job as well as they should — then trust breaks down, and expertise is endangered, on Capitol Hill, and then perhaps in the laboratory, and then maybe in the counting houses and brokerages, and then almost certainly in the newsroom.

“A modern liberal society is a complex web of trust relations, held together by reports, accounts, records and testimonies,” wrote the University of London sociologist and political economist William Davies, explaining that “when trust sinks beneath a certain point, many people may come to view the entire spectacle of politics and public life as a sham.”

So at the heart of this question — at the heart of the current civic crisis, at the heart of the threat to expertise — is this element, trust, which has to be earned and not simply given.

In rebuilding an edifice of trust, we must recognize that the worship of technology has led to disappointment, for technology hasn’t solved every earthly problem; that in some instances respect for expertise has led to despair, because the multiplication of experts did not stanch the multiplication of society’s ills; and that the populism rampant across Eastern Europe and in some parts of the U.S. is really meant as a brake on the excessive confidence — and perhaps the excessive power — of experts.

And so perhaps that entire edifice of expertise needs to move on its axis. Perhaps expertise needs to be tied more to knowledge — which, while not fixed, at least is discoverable — than to prediction, which is subject to bias, distortions and, ultimately, ignorance of future conditions. Perhaps we should regard expertise as the understanding, the wisdom, that comes from knowing the past and its traditions, from mastering that which is known — a sufficient challenge for any undertaking.

“In this time of fake news and anti-intellectualism, we need experts more than ever,” said Philip J. Hanlon, a mathematician and president of Dartmouth College. “We need them to carry their knowledge to a world in dire need of deep thinking and rationality.” 

The 18th-century statesman Edmund Burke is revered by liberals and conservatives alike, and among his fundamental legacies is the idea that there is virtue in respect for the past — and redemption in its mastery. History is not predictive; we do not learn it to divine the future. It is, instead, analytical; we study the past to understand the present. Those who master the past and understand the present are, in a word, experts.

“If a man is fortunate he will, before he dies, gather up as much as he can of his civilized heritage and transmit it to his children,” the historians Will and Ariel Durant wrote in The Lessons of History, a meditation in large measure on the contributions of Aristotle, Pascal, Plato, Plutarch, Spengler and Thucydides. “And so to his final breath he will be grateful for this inexhaustible legacy, knowing that it is our nourishing mother and our lasting life.” That, applied to our own times, our own crises, our own political struggles, is the beauty of the human condition. That, writ large, is why expertise matters.


David Shribman is a frequent contributor to this magazine. A Pulitzer Prize-winning journalist and professor at McGill University, he retired as executive editor of the Pittsburgh Post-Gazette in 2019.