At the time it seemed a reasonable proposition. At least to my 9-year-old mind it did. My older brother had told me that, if I stood on the fringed rug in the upstairs hallway, he could yank it out from under me, leaving me standing, undisturbed. What could go wrong?
We had both seen this kind of thing done before, had watched novelty acts perform the tablecloth trick on television, usually to the accompaniment of Khachaturian’s “Sabre Dance.” A deft wrist-flick and the performer would pull the cloth from beneath a fully set dinner table, barely disturbing the plates, glasses and candlesticks. I believed it could be done. I must have believed, or else why would I have been stupid enough to get on the rug in the first place?
And I continued to believe, right up to the instant my brother yanked the rug out from under me. There must have been a cartoonish-comic moment when I realized my error, but by then I was airborne, with only a bare floor to receive me when I landed — hard — on my backside.
I think of that encounter with the hallway floor as personally significant. Maybe it doesn’t qualify as a leap of faith exactly — it was more like a pratfall of faith. But is it possible that my hard landing that afternoon explains, in retrospect, what has been a lifelong reluctance to make up my mind about what I believe and what I don’t? I have been shuttling back and forth between belief and doubt ever since that day, and I’m still not sure where I belong. Am I a skeptic? A believer? A fool? A cynic? Not sure. The 16th-century essayist Montaigne had one of his mottos inscribed, in Middle French, on his library wall: “What do I know?” I consider it a victory most days if I can keep my feet while trying to figure it out.
Here’s something I believe: I believe that believing comes naturally, to me and to most people. This is, maybe, the best of all reasons to remain doubtful. Belief is a kind of wishful thinking. That sense of the word is found in its deepest, Proto-Indo-European root, *leubh-: to care, desire, love. Believers are not always fools, but belief does have a way of placing believers in a foolish position: out on a limb, in too deep, head over heels. The believer’s vulnerability is like a lover’s. Belief is not entirely rational — and sometimes it’s completely irrational. Maybe this is why the philosopher Alan Watts made this distinction between belief and faith: “Belief is the insistence that the truth is what you would lief or wish it to be. . . . Faith, on the other hand, is an unreserved opening of the mind to the truth, whatever it may turn out to be.”
Funny, then, that skeptics like to say they operate from an unreservedly open mind, and thus claim the same ground that Watts assigned to the faithful. In his essay, “The Burden of Skepticism,” the scientist Carl Sagan argued for “an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas.” He continued, “Obviously those two modes of thought are in some tension. But if you are able to exercise only one of these modes, whichever one it is, you’re in deep trouble.”
Skepticism may be necessary for the preservation of liberty, but it remains a surprisingly tough sell. Part of the problem is that skeptics too often come across as curmudgeons or buzzkills. They’re always pointing out the errors of someone else’s thought, and that’s never been a great way to make friends.
Taken to its extreme, skepticism verges on nihilism. The ancient Sicilian sophist Gorgias argued that nothing existed, but he also suggested that this was no big problem, because even if anything did exist, its existence would be unknowable. Similarly, in the Metaphysics, Aristotle writes that the skeptic Cratylus had objected to Heraclitus’ famous claim about the impossibility of stepping twice into the same stream. Cratylus said it couldn’t even be done once.
If I had exercised similar skepticism about my big brother’s rug-pulling hypothesis all those years ago, I might have spared myself a hard fall on my back. Is this what Sagan meant by “exquisite balance”?
Some might diagnose my credulity as a symptom of my nationality. “Americans so dearly love to be fooled,” sniffed the French poet Charles Baudelaire. Nor is it only haughty Europeans who have noticed our national gullibility. The University of Chicago historian and Librarian of Congress Daniel Boorstin argued that, from the start, Americans self-selected for readiness to believe. If you were here, his theory went, it was because you had bought the hype, had swallowed someone’s advertising pitch about the plentiful land, the healthful climes, the gold in the streets.
The USA has always been the place to be if you want to believe in something unbelievable. Consider the career of John Cleves Symmes Jr., hero of the War of 1812 and author of the first entirely American theory of the geographical sciences. In an 1818 paper, Symmes argued that “the earth is hollow and habitable within,” not a solid ball, but a series of concentric shells, each containing still more shells, fitted together nesting-doll style, all accessible through holes in the Earth’s outer surface at the north and south poles. Best of all, this hidden world beneath Symmes’ feet could sustain life.
Illustrations by John S. Dykes
Symmes titled his paper “Circular No. 1,” indicating in characteristically loquacious crackpot fashion that he had many, many sequels planned. This debut effort was addressed “TO ALL THE WORLD!” but Symmes’ attention was mostly focused on the legislators and funders who could help his quest: “I ask one hundred brave companions, well equipped, to start from Siberia, in the fall season, with Reindeers and slays, on the ice of the frozen sea; I engage we find warm and rich land, stocked with thrifty vegetables and animals, if not men. . . .”
With his proposal, Symmes attached a certificate attesting to his sanity.
He spent years lobbying Congress for funds to support an expedition of discovery. He took his case directly to the people, too, lecturing endlessly in small towns in New England, New York and Ohio. Most thought Symmes was bonkers, but a few thought his ideas merited consideration. A senator from Kentucky named Richard Johnson (who later served as vice president under Martin Van Buren) took the floor to read aloud Symmes’ proposal, presumably not for laughs. Another early supporter, Jeremiah Reynolds, began his own campaign, continually tweaking the thesis to eliminate the most bizarre elements. (Not an easy task, since when you took away the bizarre stuff, there wasn’t much left.) The idea of polar exploration finally generated funding for the United States Exploring Expedition of 1838-42, to the South Pacific and Antarctica.
Symmes did not survive to see it. He had traveled, lectured and lobbied so relentlessly to spread his hollow-Earth gospel that he destroyed his own health in the process. He died in 1829 at age 48, believing he was on the cusp of a great discovery — if he could only get more people to listen.
If I had been a resident of one of the heartland burgs that Symmes visited on his lecture tours, I probably would’ve been one of the rubes in the front row listening rapt to the pitch. My guess is that those who turned out came for the novelty of it, and also maybe to have a laugh at old Symmes. Or maybe they came, some of them, because they wanted to believe him. Think of it: a whole other world inside our world? How cool would that be? There is a powerful magnetism to some silly ideas, the harmlessly misguided ones that make the world briefly seem a little weirder, a little more wonderful.
My record on these matters is not a proud one. Santa Claus, the Tooth Fairy, the Loch Ness monster, ancient alien visitors — all seemed to me, at one time or another, entirely plausible. In my defense, most of my zaniest credulity came when I was a kid. With age, the naïveté excuse starts to wear thin. Still, I suppose I am something like Lewis Carroll’s White Queen, who brags about how on some days she believes “as many as six impossible things before breakfast.”
I grew up in a golden age of crackpot beliefs, those post-Watergate years when even the most implausible plot, conspiracy or cover-up seemed possible. The era-defining TV show was In Search Of…, featuring the otherworldly former Star Trekker Leonard Nimoy investigating UFOs or the secrets of the Bermuda Triangle or Bigfoot or the alien astronauts who built the pyramids. For me and a whole generation of adolescent would-be believers, such fascination with the paranormal seemed perfectly, well, normal.
Why do we believe the things we believe? Why is one person’s Nessie another person’s unicorn? Arthur Conan Doyle, the creator of the archrational detective Sherlock Holmes, spent his last decades insisting on the existence of garden fairies and other sprites. At the International Cryptozoology Museum in Portland, Maine, visitors can compare and contrast the relative merits of sea serpents and yetis, lake monsters and skunk apes, the Mothman and the Jersey Devil. Columbus wrote about mermaids in his ship’s log. Maybe belief in the supernatural and the fantastic helps us account for the unaccountable. Maybe it is a gesture toward transcendence. Maybe it’s just dumb fun. What Columbus saw was probably a manatee, but don’t mermaids make a better story?
No wonder skeptics seem so crabby. If people are so eager to believe, then skeptics have to maintain eternal vigilance. That’s exhausting. And it’s only getting more difficult, as we seem to be living in the midst of an epistemological free-for-all. It has always been easy for anyone to believe anything, but never has it been easier to find a community of fellow true believers. Consider the internet-fueled anti-vaccine movement, which the World Health Organization has labeled one of the top ten health threats on the planet. The growing resistance to lifesaving vaccines is, according to Professor Heidi Larson of the London School of Hygiene and Tropical Medicine, an “emotional contagion, digitally enabled.”
Or take the 2002 front-page scoop of that esteemed tabloid, Weekly World News: “Abraham Lincoln Was a Woman!” (And the accompanying sidebar: “Was John Wilkes Booth Her Jilted Lover?”) How many readers now believe, based on this bit of journalism, in the reality of Babe-raham Lincoln? Whatever your guess, American history suggests that the actual number is higher.
Anyway, most of us don’t really know what we believe or don’t believe. The determined atheist sitting in business class on the red-eye from Los Angeles might start praying when the oxygen masks drop from overhead. Behavior makes for a much more accurate index of belief than our own self-reports. It is also possible that some of us feel so acutely the need to believe in the unbelievable that we accept self-deception as part of the cost of doing business in the world.
And conveniently, the fantastic is sometimes worthy of belief. For a long time reports of a central African rainforest creature, an odd hybrid ruminant, had been dismissed as legend. Then the okapi, a short giraffe with distinctive striped hindquarters, was discovered living in the Congo. No longer able to claim legend status, the okapi now has to settle for being a mere curiosity. But no more curious than the platypus, with its duckbill and beaver tail; or the solenodon, with its venomous teeth and mammaries on its back; or the echidna, with its porcupine quills and four-headed penis. All seem to have been cobbled together from mismatched parts.
For that matter, can I really say that I’m not similarly compounded, bifurcated between belief and doubt, giddy enthusiasm and snobbish scoffing?
I outgrew my In Search Of… phase, just as I had outgrown my belief in the Tooth Fairy. Some beliefs, for some believers, prove more difficult to ditch. Is it just me, or is it harder now to find anything cute about transparent delusion? Are the stakes higher, the costs of misguided belief more ruinous? “Credulity is a greater evil in the present day than it ever was before,” Bertrand Russell wrote a century ago, reasoning that it was “much easier than it used to be to spread misinformation.” It is a hoot to consider what Russell would have made of the conspiracy-mongering now seeping out of the Dark Web’s most reptilian chatrooms. Or for that matter, what he would have made of the wee-hour Twitter communiqués coming from the Executive Mansion.
The sheer volume of the flood of claims washing over us, all day, every day, causes practical problems. Even if we all agree on the need for respectful debate and free inquiry and open-mindedness, wouldn’t we also stipulate that some beliefs are just unworthy of respectful debate? But which ones, exactly? Well, that depends on whom you ask. What seems obviously true to me seems harmfully ridiculous to my buddy from the other end of the politico-ideological spectrum. (And so he tells me, with some heat, over after-work beers.) So who gets to referee? And, no, simply agreeing to disagree is not really a solution to the problem; it only creates false equivalencies, of the “fine people on both sides” type.
You can find versions of my barroom debates with my buddy playing out all over Twitter, the certain and the sanctimonious lashing out at the self-righteous and narrow-minded. Everyone, it seems, is rigidly set on shredding the social fabric — everyone, that is, except the people who agree with me.
Not long ago, I phoned my brother to compare memories of the long-ago incident of the hallway rug. I was surprised to learn that he had no memory of it. He could neither confirm nor deny my version of events. I suppose this is understandable, given that it was my backside that had hit the floor so hard that day, perhaps making my recollection of the event sharper and more durable than his.
Or is it that my brother didn’t remember the incident because there was nothing to remember? Could I have made the whole thing up, invented the memory? I don’t think so, but it’s intriguing to consider the possibility, precisely because it costs me so little to consider. It would be nice if all our credulity were so inexpensive, if our naïveté never led to harder falls. I think one of the best things about belief is that it trains you so well to doubt. Just don’t ask me to prove it.
Andrew Santella is the author of Soon: An Overdue History of Procrastination, from Leonardo to Darwin to You and Me (HarperCollins/Dey St.) and a frequent contributor to this magazine.