Welcome to the Intermind

Author: Jeremy Manier '92

The Facebook executive strode onto the stage, outfitted like a ’60s-era spy in a black turtleneck and leather jacket, and calmly began describing one of the most mind-blowing research projects in corporate history.

 

What if, she said, you could type words using only your thoughts?

 

After all, fingers are clumsy, especially in this age of ubiquitous mobile devices, when we funnel our ideas into the world through five-inch screens — “little black boxes,” the executive, Regina Dugan, called them.

 

Dugan explained that Facebook is well aware of the unintended isolation that social media and smartphones tend to foster, as users interact with their screens at the expense of in-person relationships. The problem formed part of her team’s motivation to develop technology that could break down social barriers: Even as those miraculous little boxes are midwifing a new kind of society, the transformation comes at a real cost. “It has allowed us to connect to people far away from us, too often at the expense of people sitting right next to us,” she acknowledged at the company’s F8 conference in April 2017.

 

In that brief summary, Dugan diagnosed one of the key shortcomings of 21st century communications technology, and even hinted at a possible cure — weaning people away from the little black boxes. Her proposed solution? Not to back away from all-consuming technology, but rather to accelerate more urgently toward it.

 

Dugan sketched out the future that her team of more than 60 scientists and engineers in Facebook’s mysterious Building 8 lab group was seeking to achieve — the development of consumer devices capable of typing 100 words per minute by brain decoding, the reading of neural signals straight from a user’s brain. The technology likely would rely on optical sensors to detect neural activity through hair, skin and skull, extending techniques currently used in ordinary blood oxygenation monitors. Decoding the mind’s language, she said, promised to amplify the supremacy of mobile technology while rendering it invisible. It could connect people more intimately, giving them the mental tools to “send a quick email without missing the party,” and to be perpetually on social media without forsaking their social surroundings.

 

“That would be crazy amazing,” Dugan marveled. “But it’s only the beginning of what’s possible.”

 

Illustrations by Stuart Bradford

The nature of friendship

That vision of the future warrants the concern of anyone who uses social media — and not because the vision is in imminent danger of coming true. It’s not. Though the hurdles are far more technical than theoretical at this point, they’re real, and it will take many years or even decades to surmount them. And while Facebook’s Building 8 could be the biggest boon for conspiracy theorists since Area 51, “conspiracy theory” supposes that shadowy forces might foist such technology on an unsuspecting public. The reality is more prosaic. If this logical extension of the social-media status quo takes hold, it almost certainly will come about because people choose it, just as most of us have chosen social media’s current tradeoffs: the loss of privacy and mental stillness along with gains in digital exhibitionism and feelings of connectedness. Facebook’s mind-reading ambitions are more than a disquieting blip on the horizon. They’re a reflection of how we live right now.

 

As social media and the black boxes that propagate it come to dominate more of our attention and social activity, they may also be changing how we think about friendship and individual identity. How should friendships evolve over time and distance? What’s the distinction between one mind and the next, or between mental space and the physical world?

 

These are ancient questions, thrust into the increasingly messy technological vat of social media. Within that mess, ironies abound. It’s a public arena that prizes “friends” yet in many ways flouts meaningful definitions of friendship. It’s a social connector that also disconnects, encouraging users to live more exclusively in their own heads, even in a crowded room.

 

For such reasons, it would be naïve to think of social media as just another tool. It’s an environment, and environments have a way of forcing their inhabitants either to adapt or to fall into decline. Signs of potential decline are easy to find. But the prospect remains, however slim, that in adapting we could find more than a way forward — that our social media future might even reveal unexpected beauty and unity.

 

It’s no secret that smartphone culture has changed how people socialize. One decade into its ascendancy, the marriage of smartphones and social media is also reshaping how friendships live and die. Friendships that form during college, for instance, have always been especially intense, though the majority are short-lived. Social media stretches out such friendships, and it may risk diminishing them.

 

For those of us who attended college before the turn of the 21st century, graduation was a milestone with a feeling of finality — for life plans and relationships alike. During a philosophy class in my last semester at Notre Dame in 1992, Professor David O’Connor made a point about long-term friendships that seemed harsh yet strangely comforting.

 

Most of your friendships here won’t last, he said.

 

The class focused on ancient ideas of happiness, and Dave (I think I can call him Dave now) had framed the material as a potentially useful guide as we pondered what to do with our lives. On that particular day, his message was unsparing. In our discussion of Aristotle’s Nicomachean Ethics, he flagged a section describing how friendship is tied to shared activity and an orientation toward common goals. That’s what college is, he observed. After graduation, when our common pursuits were over, most of the friendships we had formed would naturally fade away. Quite the buzzkill, that Aristotle.

 

But Dave’s larger point was reassuring. If Aristotle was right, friendship really is inseparable from the goals that bind us together. We would soon begin fresh chapters in life, with new goals and new friends. One can choose which old friendships to maintain and nurture, but it shouldn’t be surprising or all that saddening when even strong friendships recede into memory. It would be more surprising if they didn’t.

 

These days, of course, friendships don’t really recede, or at least they don’t have to. At last count I had more than 900 Facebook friends, and I’ve tried to include only people I’ve known and cared about at some point, through many phases of life. But lately — no offense to my peeps — I’ve thought of it as something of a “friendship afterlife.” Social media can resurrect friendships that ran their course long ago, and keep them going through the ambient awareness of the news feed. That’s marvelous in one sense, or we wouldn’t use it. Yet it also can feel like the class notes section of an alumni magazine — every single day of our lives. Is it misanthropic to wonder if social media is defined more by zombie friendships than by something real?

 

I feared that question only reflected my own feelings of social awkwardness, which predate social media by a long stretch. So I contacted Dave for the first time in 25 years, and we met over Christmas to discuss whether technology has transformed classic ideas of friendship. He said it has in many ways, including the elimination of the need for friends to be physically present with each other.

 

“Aristotle understood the best kind of friendship as the actualization of a shared project, and for him that actualization was with another present person,” he said as we made our way through omelets at the Morris Inn. “So it’s very concrete, even though his language may sound abstract to our ears.”

 


 

Aristotle’s examples of activities that foster friendship require face-to-face contact — people drinking together, serving in the same phalanx in a war, or doing philosophy, which he understood not as reading someone else’s books but as striving together through extended conversation. His view of how many friendships a person could sustain was similarly concrete, defined by how many people one could work with on shared projects.

 

Real shortcomings limit this understanding of friendship. For example, Aristotle had relatively little time for friendship’s emotional aspects. But applying an Aristotelian lens to social media raises questions that get lost in the everyday buzz of Facebook posts.

 

“To what extent can we extend the range of people with whom we can actualize a shared project, through the modern technologies of self-extension? That’s how Aristotle would be asking the question about social media,” Dave said. “Are we using social media as instruments of self-extension, so that we have that intimacy and density of interaction but with a larger range of people, or are they instruments instead of a kind of self-dissipation? Do you end up just extending yourself too thin?”

 

Such questions likely did not occur at first to the Harvard students who invented Facebook for their classmates. The website is a system optimized for constant affirmation and validation, originally aimed at a college audience in huge need of those things. That heritage continues to shape the network, even now that most of its users are well beyond college age. At worst, Facebook is a vehicle for extending the social fretfulness of our teenage years indefinitely, reducing friendships to those old insecurities. Do they like me? Who thought my comment was clever, or stupid? Are my friends giving me enough attention?

 

The unintended consequences of social media now trouble its pioneers, albeit belatedly. The “Like” button itself became an albatross for its creators, including former project leaders such as Leah Pearlman and Justin Rosenstein. Rosenstein describes Facebook likes as “bright dings of pseudo-pleasure,” contributing to a culture of constant distraction; Pearlman compares trawling for likes to binge-eating a bag of bad potato chips. Former Facebook president Sean Parker has said the network was built around “a vulnerability in human psychology,” and former Facebook engineer Chamath Palihapitiya says the “dopamine-driven feedback loops” of social media are “destroying how society works.” Like many jaded tech innovators, Pearlman and Rosenstein say they’ve taken conscious steps to limit their exposure to social media, mostly for their own emotional well-being.

 

If the people who created social media are worried, maybe the 68 percent of Americans who use Facebook should adjust accordingly.

 

To gauge the concern among my peers, in January I took an unscientific poll of my Notre Dame classmates on Facebook through our group page. Did they feel that social media had deepened their long-term friendships, or did they worry that the bonds were not genuine? Had the platform possibly even harmed some of their friendships?

 

Most of the responses were some version of “all of the above.” Jim Doppke said social media creates “both immediacy and distance,” and while he’s thankful for the ability to stay in touch with a broad group of old friends, he wishes social media “could approximate or represent us all better, without artifice or algorithm.” Angie Buckingham Melton said it can feel like “a substitute for real interaction.”

 

Father Steve Newton, CSC, ’70, the former rector of Sorin Hall, said Facebook was a godsend when he suffered a serious fall that broke his neck, back and wrist. It allowed hundreds of alumni to get in touch and offer support. “I came to love Facebook for being able to deepen many, many friendships, then and since,” he wrote.

 

My former roommate Timothy Deenihan said he likens Facebook to “a global ‘front porch’ level of friendship,” with interactions that range from completely superficial to quite real and deep. He worries less about the quality of engagement than about the disconnection from consequences — the tendency for online interactions to derail because the participants refuse to hear each other.

 

Such negative turns seem to be more likely online than when two people are talking over coffee. One recent study found that people tend to be more dismissive of a perspective they disagree with when they read it as text, and are more receptive when they hear a speaker make points out loud. That may be because the human voice conveys cues that written communication lacks. Aristotle was more on target than he knew; in friendship, there’s no substitute for engaging with a living, present person.

 

Facebook may be coming around to the view that fostering real relationships is the core of its business. In January, the company announced a fundamental change to its news feed concept, moving away from “relevant content” and toward what Facebook CEO Mark Zuckerberg called “meaningful social interactions.” In practice, that means giving greater visibility to posts from friends and family and less to those from businesses, news media and other sources. Zuckerberg said the changes could reduce the amount of time people spend on the network. “But I also expect the time you do spend on Facebook will be more valuable,” he said.

 

Brain extensions

The idea that technology might change what it means to be human is as old as technology itself. In the Phaedrus, Plato relates a tale of two Egyptian gods, Theuth and Thamus, weighing the merits of Theuth’s inventions, particularly the advent of writing. Theuth argues that writing is an elixir that “will make the Egyptians wiser and will improve their memory.” But the king, Thamus, rebukes Theuth, arguing that his invention will have the opposite effect. “For your invention will produce forgetfulness in the souls of those who have learned it, through lack of practice at using their memory,” Thamus says. “You have invented an elixir not of memory, but of reminding. To your students you give an appearance of wisdom, not the reality of it.” Both gods had a point. Writing is an essential extension of memory and cognition, yet it also renders memory a far less valuable commodity than it was in the age of oral traditions.

 

Social media and smartphones represent the latest stage in this evolution. They expand what individual minds can achieve, at the cost of making us that much more dependent on technology. When was the last time you left home for the day without your phone? Did it make you feel not just vulnerable, but somehow less than your full self? A thinking person who gives up the phone altogether has become almost as hard to imagine as someone who chooses not to use writing. The cell phone, like writing, is a more or less permanent appliance of our minds.

 

That’s why Facebook’s thought-to-text project seems like the logical next step. The social network has already staked a figurative claim at the core of our thoughts and friendships. Why not make that connection more physical?

 

Decoding brain activity is a long-term quest for University of California, Berkeley neuroscientist Jack Gallant, who has received funding from Facebook for some of his work. I first interviewed Gallant 10 years ago, when his team discovered how to use brain scans to deduce what photos a person was viewing. In the following years, they applied similar techniques to reconstruct movies based on the brain activity of people who watched them, producing moving images that were ghostly yet surprisingly accurate. That phase of their work culminated in 2016, when the lab published a semantic atlas of brain areas, showing where the brain stores and processes the meanings of specific words and concepts. They also succeeded in using brain scans to decode not just the raw images that subjects were watching, but the high-level concepts that those images evoked, such as “talk,” “animal” or “vehicle.”

 

“The point when I really felt like, ‘Oh wow, this changes everything,’ was when we started decoding high-level information” from the brain, Gallant told me in a recent interview. “It was the semantic content of the movies, the meaning of the movies. And once I saw that, I thought, OK, there are [about] 500 brain areas.” His team could potentially decode the activity in every one of them.

 

E pluribus unum

Compared to deciphering films, using brain activity to type out words is relatively straightforward, and it’s already being tried with some disabled patients, as Regina Dugan described in her Facebook talk. The main obstacles to routine, consumer-grade brain decoding are not theoretical but technological. It requires either the surgical implantation of electrodes on the brain, which provides accurate readings but will never be practical for consumers, or a way to read brain activity from outside the skull, an approach that remains error-prone. In addition, the functional MRI scans that Gallant and others use would be impractical for tracking a brain function like speech in real time, because they measure blood flow and cannot be updated more quickly than every two seconds.

 

Despite the technical obstacles, Gallant said, “I’m a believer that at some point there will be a hat you could wear that will overcome the problem. But that could take anywhere between 10 and 100 years.” If it worked, Facebook could integrate brain decoding with a device like the Oculus Rift, the virtual reality headset that the company acquired in 2014. Gallant believes brain decoding is one of a handful of technology-based shifts that could fundamentally change humanity, along with artificial intelligence, life extension, gene editing and climate change. “Almost nobody is preparing for it,” he added.

 

While the technology brings great risk, it also has the power to improve lives, for some of the reasons Dugan laid out on that San Jose stage. The ability to easily decode brain signals could bring a revolution for people with spinal injuries, blindness, hearing loss, or limb loss, allowing them to control elaborate robotic prostheses or communicate with other people in new ways, such as by transmitting mental images. In the last decade, mind-controlled robotic arms have progressed from animal experiments to the first attempts to integrate such arms into the daily lives of amputees outside the lab.

 

As Dugan pointed out, developing the brain for use as a typewriter might be a crude first stage, since work by Gallant and others indicates it’s possible to glean not just words but their meaning directly from the brain. From an engineering perspective, words are “compressed thoughts,” and language is a lossy compression algorithm that feebly strives to capture what’s going on in our minds. The challenge is to develop more efficient ways of expressing the brain’s vast, whirring activity, which Dugan estimates as the equivalent of four streaming high-definition movies each second.

 

“One day, you may be able to choose to share your thoughts independent of language,” Dugan said. “English, Spanish or Mandarin — they become the same.”

 

That could foster a social network of an entirely new kind. Philosopher David Chalmers of New York University says technology already allows brain functions such as memory and problem-solving to be offloaded into computers or other devices. Social media and the rise of brain decoding take that idea further, raising the prospect of what Chalmers calls “the socially extended mind,” in which “other people become extensions of your mind,” and vice versa. Gallant said DARPA, the Pentagon’s Defense Advanced Research Projects Agency, is interested in such technology because it could radically improve situational awareness on the battlefield. If all soldiers knew what the others were thinking, it could cut through the “fog of war” with sobering efficiency.

 

DARPA and others are also looking at technologies that could not only read brain signals but “write” to the brain, producing images and other experiences by stimulating neurons. One candidate technology for that task is called “optogenetics,” in development since 2006, which involves using light to control neurons modified to be light-sensitive. Optogenetics can also gauge the activity of brain cells, and unlike previous brain-scan technology, it allows for precision timing on the scale of milliseconds rather than seconds. If science can somehow perfect the tasks of reading from and writing to the brain, Gallant said, “you could make a collective mind.”

 

A terrible beauty?

That distant vision of profound union may be social media’s most tantalizing and frightening prospect.

 

Social networks provide an imperfect means of breaking down the psychological distance between individuals, and new technology is creeping ever closer to that goal before anyone is ready for it. Even if we were ready, would we want it? There is something disturbing yet beautiful about the idea of removing the barriers between minds. Imagine what it could mean to be so thoroughly understood, beyond the power of language to convey. Would the notion of an isolated mind become as anachronistic as the idea of a mind without writing? Would we start fewer wars, or more? For hundreds of years, philosophers have wondered how anyone can know that other minds are real. Facebook might end up solving that seemingly intractable problem — not by reasoned argument, but through corporate inertia.

 

It all has the air of a science fiction novel. That’s what I thought 10 years ago when I first came across Jack Gallant’s work and started writing a couple of chapters of a novel with the working title of “Intermind,” pondering a future of connected psyches. My project paused, but the technology didn’t. These days, in ways that no one foresaw, the novel is writing itself.

 


Jeremy Manier was a science and medical writer at the Chicago Tribune from 1996 through 2008. He is now assistant vice president for communications at the University of Chicago.