Editor’s Note: “It’s as though the public feels trapped in a crossfire of competing, entrenched interests. Most people crouch to stay out of the way, while anything resembling a larger common purpose gets lost.” Robert Schmuhl ’70 wrote that a decade ago, along with much else in this Magazine Classic that could be copied and pasted into a story about America in 2020.
Americans are cranky and in a national funk. Weary of partisan acrimony that keeps sharpening political divisions rather than solving problems, they watch their faith in Washington plummet while anger builds. Reeling from Wall Street shell games and ethics-be-damned business practices, they see their confidence in corporations plunge to Depression-era depths. Even once-revered institutions, widely acknowledged for moral guidance and probity, suffer the ignominy of scandal — with public trust tested to unprecedented degrees.
Where on earth are we heading?
Today almost every large, organized realm of activity that impacts our lives is viewed either skeptically or with scorn. Increasingly, people feel targeted by untamed forces that previously seemed more civilized. How many times over the past month have you tried to remedy a concern over the phone — but ended up thinking you were lost in a maze of pre-recorded prompts several area codes away from a human who might be able to help?
Our embittered national mood shows clearly in sophisticated opinion surveys measuring the public’s trust and confidence in government, business, media, education and religion. The most troubling finding amid the mountain of data is the downward trajectory for the majority of institutions — the dwindling faith Americans hold in key sectors of daily life.
Last spring the Pew Research Center interviewed more than 2,500 adults about the workings of government and discovered just 22 percent of those surveyed trust Washington the majority of the time. This percentage — fewer than a quarter expressing contentment — nearly matches the lowest point for this question, a finding of 17 percent in both the fall of 2008 (when Lehman Brothers made the largest filing for bankruptcy in U.S. history and several banks failed) and the summer of 1994, right before a boiling-mad electorate decided it wanted divided government and dramatically swung its loyalty in both houses of Congress from Democrats to Republicans for the first time in 40 years.
In recent years, dissatisfaction seems to simmer — and then the people try to change what’s happening via the ballot box. Campaigning prior to the 2010 midterm elections was savagely negative, a reflection of a toxic time, but it also added fuel to disillusionment among the citizenry.
The desire for change is so strong that anyone in power runs the risk of qualifying for throw-the-bum-out treatment. As a result, an unhappy public that lacks definite ideological moorings swings from one party to the other in hopes of finding a way to dispel the pervasive gloom.
Declining trust is just one dimension of the public’s opinion of government. Pew also found heightened discontent. For the first time in Pew’s polling history, people “angry” with Washington exceeded those who were “basically content.” A sizable majority (56 percent) labeled themselves “frustrated.”
Taking the “angry” and “frustrated” together, more than three-quarters of the respondents in this survey are implicitly saying — to paraphrase Shakespeare — something is rotten in the state of America, especially in Washington. Any number of causes feed these feelings: A sense of perpetual partisan warfare, take-no-prisoners rhetoric leading to questionable action, serious problems festering without bring-us-together solutions.
Moreover, the media serve as accomplices in this continuing melodrama by magnifying the most extreme positions. Moderate voices don’t rise to the level of a whisper in a political-media environment that revels in top-of-the-lung debates emphasizing the sharp edges of conflict.
An evening of flipping between Fox News and MSNBC generates more heat than light, because each channel has decided to pursue a partisan viewpoint in its prime-time programming, with Fox on the right and MSNBC on the left. This business and ratings decision produces political consequences by reinforcing already formed opinions rather than building consensus. CNN, which has tried to steer a middle course, keeps falling behind both of its competitors. Certainty expressed with authority, also the approach of most talk radio, trumps a reasoned exposition of pros and cons.
As a result, perceptions of gridlock, stalemate and dysfunction — over immigration, economic disparity, energy, health care, education and the like — form naturally when any semblance of agreement or compromise appears remote. What makes compelling, spark-filled discussions for the media won’t ever produce a framework for working together. The resulting sense of an endless debating society stokes anger and frustration among the public.
A few months after Pew’s findings about Washington appeared, the Gallup Organization released its annual survey of “Confidence in Institutions.” Bringing up the rear in this assessment of 16 different institutions was Congress at 11 percent, a record lack of regard. Both the presidency and the Supreme Court tied at 36 percent, but each reflected a drop from the previous year. Big business and health maintenance organizations (HMOs) did marginally better than Congress at 19 percent, while the military continued as the highest rated institution at 76 percent, a standing it’s maintained since 1998.
These and countless other similar surveys quantify personal reactions to what’s happening. There’s admiration for those serving in uniform and the sacrifices they endure, but growing hostility directed at government for not solving the nation’s problems and at corporate America for insider machinations appearing to favor the few over the many. It’s as though the public feels trapped in a crossfire of competing, entrenched interests. Most people crouch to stay out of the way, while anything resembling a larger common purpose gets lost.
In recent years, as mammoth financial institutions and business concerns such as giant automakers either imploded or tottered on the brink of survival, the phrase “too big to fail” gained currency in the lingo of economic players and experts. Hearing it, though, those folks facing employment or mortgage difficulties wondered if they were considered “too small to be helped.” As that attitude grew, an us-versus-them discouragement developed, pitting the singular individual against whatever large institution might be involved at the center of an issue.
Last fall during a town-hall discussion of jobs, President Barack Obama said in his opening remarks, “The whole reason I ran was because my life is testimony to the American Dream. And everything that we’ve been doing since I came into office is designed to make sure that that American Dream continues for future generations.”
Later, when the audience had a chance to ask questions, a man spoke for more than himself. “It feels like the American Dream is not attainable to a lot of us,” he said. “And what I’m really hoping to hear from you is several concrete steps that you’re going to make moving forward that will be able to re-ignite my generation. . . . And I really want to know, is the American Dream dead for me?”
In this truth-to-power moment, the stark reality of the nation’s enduring and alluring myth — first enunciated in 1931 during the nightmarish days of the Great Depression — is brought into bold relief. The speaker knows how U.S. life has been and should be, but his own experience tells another story. With the bedrock belief that the future will be better than the past under assault, can’t someone, particularly the nation’s leader, do something when so much is at stake?
Questions about the premise and promise of the American Dream reverberate with greater frequency of late with the economy faltering, jobs more difficult to find, globalization scrambling traditional business practices and Wall Street so distant from Main Street U.S.A. that it seems like another country — with its own foreign language (credit-default swaps, collateralized mortgage-backed securities, derivatives and the like), mysterious mores and “greed-is-good” philosophy.
But the nation’s souring didn’t happen overnight. In 1958, when the University of Michigan’s National Election Study began to ask about trusting the Washington government to do what’s right, solid majorities continually responded “just about always” or “most of the time” until 1974. It might be difficult to fathom now, but throughout the tumultuous 1960s better than 60 percent voiced confidence in what was happening in the nation’s capital.
Slippage to around 50 percent occurred early in the 1970s, with debate over the Vietnam War raging and roiling America. Then, in 1974, after all the nefarious shenanigans lumped under the single word “Watergate” came to light and Richard Nixon became the first president to resign from office, the percentage of trusting Americans descended to 36. By 1980, the number was down to 25 percent.
As president from early 1977 until the beginning of ’81, Jimmy Carter was recipient-in-chief of much of the criticism directed at government. With inflation spiking upward, energy worries and the long-running Iranian hostage situation (among other troubles), he seemed unable to tame the tempests battering the nation he was elected to lead.
Even though the levers of government to effect positive change always seemed beyond his reach, Carter possessed an acute understanding of the nation’s mood. In July of 1979, he used a speech from the Oval Office to describe the temper of that time. His words were pointed and, for a politician, somewhat impolitic.
Carter identified “a crisis of confidence” as a fundamental concern, “a crisis that strikes at the very heart and soul and spirit of our national will.” He pointed out “a growing disrespect for government and for churches and for schools, the news media, and other institutions. This is not a message of happiness or reassurance, but it is the truth and it is a warning.”
Some of Carter’s sharpest language cut to the bone. Drawing on his White House experience, he said: “What you see too often in Washington and elsewhere around the country is a system of government that seems incapable of action. You see a Congress twisted and pulled in every direction by hundreds of well-financed and powerful special interests. You see every extreme position defended to the last vote, almost to the last breath by one unyielding group or another. You often see a balanced and fair approach that demands sacrifice, a little sacrifice from everyone, abandoned like an orphan without support and without friends.”
If anything, the political situation, especially in Washington, has become even more citizen-unfriendly — with hyper-partisanship and its equally evil twin, polarization, exacerbating the sense of helplessness for millions who consider themselves, in a civic sense, independent and moderate.
Of course, a direct correlation exists between specific events and people’s opinion of institutions involved in those events. When a member of Congress faces corruption charges, a scandal within an established church makes headlines, or a company implodes like a house of cards, the consequences produce collateral damage with far-reaching effects on how we think and feel. Trust becomes difficult in an environment polluted by misdeeds and malfeasance.
Last April, as more stories about sexual abuse committed by Catholic priests came to light, CNN and Opinion Research Corporation inquired whether the Church had “done a good job or a bad job in dealing with the problem.” Nearly three-quarters of Catholics (74 percent) thought the Church had not dealt well with the situation, only 10 percentage points fewer than among all respondents to the survey question.
Catholicism is by no means alone on today’s metaphorical Via Dolorosa. But what’s often not considered as seriously as it should be is the radical change in how we learn about our central institutions. During the early 1970s, Carl Bernstein and Bob Woodward of The Washington Post were considered heroic by many for their revelations of Watergate, and at the same time CBS News anchor Walter Cronkite was selected as the country’s “most trusted” figure in one survey.
Since then, information technologies have exploded in what’s been called the “big bang” moment of media development. Today more outlets, as mentioned earlier, emphasize opinion-propelled criticism and argument to the point where basic, verifiable facts are elusive in a wintry blizzard of endless debate.
Gone are the days when the philosopher Hannah Arendt could write, “Freedom of opinion is a farce unless factual information is guaranteed and the facts themselves are not in dispute,” or scholar-statesman Daniel Patrick Moynihan would more directly plead, “Everyone is entitled to his own opinion, but not his own facts.” Increasingly, misinformation about a person, policy or program circulates with an ersatz credibility that contradicts what’s actually true, leaving people in a quandary about essential building blocks for shaping an opinion.
In this climate, is it any wonder that we see greater reliance on sources with a fixed, ideologically determined viewpoint? Broadcast blowhards of whatever persuasion flourish, even when what they say is constructed on a shaky foundation, because certitude sounds convincing and serves to reinforce an established perspective.
Even though, in theory, the wealth of media choice means citizens can determine their own menu of what to read, see or hear, traditional sources don’t escape public disapproval. In Gallup’s 2010 survey of “Confidence in Institutions,” only 25 percent expressed trust in newspapers and 22 percent in television news.
Whether legitimate or not, charges of media bias influence people’s views of journalism, and — let’s face it — as messengers of so much bad news (economic woes at home, bloody fighting abroad, assorted natural disasters and all the rest) the press and broadcasters pay a price for just telling the story. Ironically, though, at moments of national trauma — like the attacks of September 11, 2001, or Hurricane Katrina in 2005 — you find a much different reaction to those same sources of news coverage. Immediately after 9/11 the media enjoyed a positive rating of 89 percent, a sign that focused treatment of a nationally significant subject can produce a sense of community.
However, a more generalized distrust now serves as the default position for those who deliver the news on a daily basis. Though people can and do turn elsewhere for information they deem relevant in their lives, lingering doubts persist about big-time sources that seem remote and fixated on audience appeal.
To hold an audience in an increasingly competitive environment, the media accentuate conflict, and as a result divisions deepen and public doubt grows. Even Sunday morning public affairs programs on the major networks, such as Meet the Press, Face the Nation and This Week, tend to feature strategically selected guests espousing stark political differences.
A discussion isn’t really a discussion. It’s rhetorical wrestling, complete with verbal half nelsons and high-decibel body slams. As riveting as many of these performances might be, the public over time concludes that problems are being treated as point-scoring moments rather than as difficult issues to be resolved. It’s more difficult to feel any warmth about the future when so many statements we read, see and hear are scripted to be incendiary.
In that climate, is it any wonder we see anger and frustration percolating among the people? And with all the sniping that occurs, should we be surprised when a member of the House of Representatives shouts “You lie!” during a presidential address to Congress, as happened in the fall of 2009?
All the surveys measuring confidence, trust and faith — words used interchangeably in polling — tell only part of the story of contemporary America.
In recent years, academics and writers also have tried to make sense of what the American angst and its causes might mean for the nation and for the years ahead. Notably, Yale historian Paul Kennedy argued in his 1987 study, The Rise and Fall of the Great Powers, that since the 16th century the interaction between government spending and military commitments can be a determining factor in whether a country achieves, and then retains, global influence.
Near the end of his almost 700-page analysis, Kennedy observes that “the United States now runs the risk, so familiar to historians of the rise and fall of previous Great Powers, of what might roughly be called ‘imperial overstretch’: that is to say, decision-makers in Washington must face the awkward and enduring fact that the sum total of the United States’ global interests and obligations is nowadays far larger than the country’s power to defend them all simultaneously.”
As it turned out, the “declinist” thinking the book represented lost some of its pertinence with the fall of the Berlin Wall in 1989, the dissolution of the Soviet Union in 1991 and the economic boom years of the late 1990s, which yielded a surplus in the federal budget. But Kennedy’s cautionary thesis ignited a broader debate about the nation’s trajectory and its role in the world.
Indeed, shortly after the U.S. military started combat operations in Afghanistan and Iraq, bookstores here and abroad devoted burgeoning shelf space to both popular and scholarly works assessing America’s status as an empire and whether the country deserved — or even desired — that status. Some of the titles are revealing in themselves: The Eagle’s Shadow: Why America Fascinates and Infuriates the World; The New Imperialism; The Imperial Tense: Prospects and Problems of American Empire; American Dream, Global Nightmare; and Überpower: The Imperial Temptation of America.
Such books, often written by foreign observers, try to come to terms with America’s standing economically, culturally, militarily and politically. One commentator viewed the United States as “the oblivious empire” — relatively unaware of the country’s reach and influence beyond our shores — while another invoked Freud to describe “an empire in denial.”
Whatever the worth of the arguments, questions and doubts cloud these imperial inquiries in ways similar to the survey research measuring the public’s confidence and trust in institutions. The most original of these broadly gauged examinations is Cullen Murphy’s provocatively titled Are We Rome? The Fall of an Empire and the Fate of America. With careful marshalling of evidence, he draws explicit comparisons between America and the long-ago imperium Edward Gibbon described in his classic 18th century work, The History of the Decline and Fall of the Roman Empire.
Murphy’s probing of parallels — an emphasis on military might, the concentration of power in the capital city, a parochial citizenry without much interest in the wider world — is deft without being deterministic, more on the order of warning-signal similarities we’d do well as a people to consider. But the differences he notes also merit consideration.
“The political gulf between Rome and America is wide, morally and procedurally,” he writes. “America’s democratic form of government looks to us like a flawed and tarnished thing, and we lament its grave deficiencies. But it’s more adaptable, just, and robust than anything Rome came up with in a thousand years. Elections remain a check on power, a crude and clumsy but as yet sacred way to reorient the compass.”
Murphy emphasizes America’s adaptability as an indigenous trait that’s served the nation well in our own and relatively brief existence. Problems might present themselves with troubling regularity, but, however imperfectly, they tend to receive a rectifying response.
At the end of his comparative study, he observes: “America has lived through more social transformations in a few centuries than Rome did in a millennium. In less than two hundred years America has experienced the end of slavery, the leap from a farm to a high-tech economy, and an influx of alien newcomers whose presence, in percentage terms, is greater in size and proportion than the barbarian influx into Roman lands. We don’t live in Mr. Jefferson’s republic anymore, or in Mr. Lincoln’s, or even in Mr. Eisenhower’s. In America as in Rome, especially in disordered times, there is always the threat of cynicism.”
Since Are We Rome? was published in 2007, what’s called the Great Recession, with its seemingly intractable economic woes and mounting worries about a dysfunctional government, has elevated this “threat of cynicism” and sparked a new round of assessments about America’s future. In a much-discussed essay for Foreign Affairs last spring, Harvard professor Niall Ferguson made the case that a ballooning federal debt could ultimately have a crippling effect on the United States, with far-reaching consequences on domestic programs as well as security obligations internationally.
For Ferguson, America’s spending spree with borrowed money could lead to a tipping point involving “sudden and catastrophic malfunctions” and systemic collapse. Like Paul Kennedy, Ferguson is a transplant to American soil, a British-born and -educated historian, acutely aware of his native homeland’s plunge from imperial heights.
Trenchant as such forebodings might be, is it fair to use someone else’s history as the lodestar for a nation with a vastly different character, philosophy and approach? As Cambridge historian Piers Brendon remarked, “[T]he past is a map, not a compass. It charts human experience, stops at the present and gives no clear sense of direction.”
One other British commentator, Timothy Garton Ash, who divides his time between Oxford and Stanford, provides a more nuanced assessment of the United States. “If you want to feel optimistic about America’s chances of renewal, go to Silicon Valley,” he wrote last September. “For a downer look to Washington. The struggle for America’s recovery is the battle of the iPad against the filibuster.”
The real challenge for the future will come in confronting the political polarization and dysfunction head-on. This will test the country’s capacity for adaptable self-correction and involve a collective decision that problems can no longer stagnate. Shoulder-to-the-wheel solutions seem imperative.
Yet, in this regard, surveys about trust in government and related concerns become instructive. While citizens might be frustrated and even angry, 52 percent told Pew in 2010 “the political system can work fine, the members are the problem.” Far fewer, 38 percent, think “most members have good intentions, it’s the political system that is broken.”
In late summer 2010, the Associated Press reported about a poll it jointly conducted with GfK and Roper that revealed 74 percent believe the U.S. Constitution “is an enduring document that remains relevant today.”
If faith in the system itself and its guiding doctrine continues to exist, that means purposeful reforms — in, say, campaign financing, in redistricting on a more democratic basis, or in nominating presidential candidates with a fairer, national process — could reduce some of the distrust and make Washington a less hostile environment as well as a better functioning capital.
Though worthy of criticism on certain grounds, the inchoate Tea Party movement is a flashing sign that the status quo is no longer acceptable to many Americans and that change is being demanded. The anger quantified in opinion surveys is the national kindling that keeps Tea Party proponents aflame.
Largely antigovernment in orientation, this movement is as much a nostalgic throwback to an earlier time as a contemporary political force. With so many aspects of U.S. life under assault, can’t Washington, Tea Partiers ask, find some answers to thwart the decline in the nation’s standard of living and its historic self-image as a country?
The New York Times columnist Thomas L. Friedman is probably not alone in wanting to see the emergence of “a Tea Party of the radical center.” In his proposal, outlined in a column last spring, political independents and moderates would confront Washington’s partisanship and polarization directly to address key policy concerns.
“The radical center is ‘radical’ in its desire for a radical departure from politics as usual,” he wrote. “It advocates: raising taxes to close our budgetary shortfalls, but doing so with a spirit of equity and social justice; guaranteeing that every American is covered by health insurance, but with market reforms to really bring down costs; legally expanding immigration to attract more job-creators to America’s shores; increasing corporate tax credits for research and lowering corporate taxes if companies will move more manufacturing jobs back onshore; investing more in our public schools, while insisting on rising national education standards and greater accountability for teachers, principals and parents; massively investing in clean energy, including nuclear, while allowing more offshore drilling in the transition.”
Such a movement of what Friedman terms “hybrid ideas” would seek consensus or common ground across party or ideological lines, with give-and-take replacing political rigidity and rancor. To succeed, of course, requires leadership from both inside and outside government — leadership that looks beyond the self or a specific group to the more encompassing public at large, what in times past was referred to as the common good.
That same type of leadership is also desperately needed in business, education, religion and media — across the array of American institutions. With it, trust, confidence and faith would, undoubtedly, follow.
Yet, a hard-headed realist might wonder: Is such thinking even possible? Do we dare imagine a program of multi-institutional correction with the objective of national improvement?
Each of the past three presidents devoted sections in major speeches to appeals for “changing the tone in our nation’s capital,” as George W. Bush phrased it in his first speech to a joint session of Congress in 2001. His father, George H.W. Bush, criticized “a chorus of discordant voices” in his 1989 inaugural, while Bill Clinton deplored “the politics of petty bickering and extreme partisanship” in his second inaugural in 1997. But words alone can never produce political harmony, especially when basic civility seems in short supply around Washington and across the airwaves.
It will also take inspired and deliberate action, stressing long-term benefits over near-term, usually next-election results. And, instead of serving as spectators with thumbs pointing perpetually down, the public will need to get involved actively as citizens who are intent on helping to find workable ways to build more confidence and trust.
Exactly what shape these efforts might take is the province of speculation. Especially at a moment of anger and frustration with Washington, will there be greater reliance on individuals and small, related institutions? Will change take place more locally than nationally? Will mavericks play a central role, with their impact changing the direction of established institutions?
From America’s founding, a distrust of bigness has become a defining national characteristic, part of our civic DNA. To keep personal freedom front and center, three branches of government were created to provide checks and balances — and the broad sharing of powers. A century ago, business trusts and monopolies were broken up to reduce corporate omnipotence so as to help both the worker and the consumer. More recently, robust criticism of the major media mushroomed and new sources gained followers, as the people perceived some information outlets too large and dominant.
It’s the American way. But the silver lining of hope amid all the worry and gloomy survey numbers is another deeply rooted national characteristic that can help propel a return to a more positive, forward-looking mood.
At almost the same time last spring that the Pew Center was finding so much hostility directed at the government, Pew in conjunction with Smithsonian magazine conducted a poll about America’s future. Despite all the economic, political and social problems, 64 percent of respondents reported being personally optimistic, and 61 percent remained optimistic about the country. Majorities were also upbeat about the performance of the economy, the improvement of race relations and the affordability of health care over the next four decades.
As commentator Mark Shields ’59 noted in his syndicated column last fall, “An America without optimism is an America lacking confidence either in herself or in her future. . . . For the country’s and the world’s well-being, Americans — who include the leaders and all us followers — must determine to recapture that native optimism and the national sense of confidence it inspires. Unless we can, we risk dooming our own children’s futures.”
Confidence and trust will continue to be tested. Yet, if we draw on our indigenous dynamism and constructively figure out what it takes to create a can-do future more faithful to our history, then maybe all those ruminations about decline and fall could be warnings without warrant.
In other words, it’s up to all of us.
Bob Schmuhl is the Walter H. Annenberg-Edmund P. Joyce Professor Emeritus of American studies and journalism at Notre Dame. He is the author or editor of 15 books, most recently Fifty Years with Father Hesburgh: On and Off the Record and The Glory and the Burden: The American Presidency from FDR to Trump (both published by Notre Dame Press).