The Gist: Social Responsibility

Panel discusses online misinformation and policy changes to strengthen democracy

Author: Margaret Fosmoe ’85

Schirch, Penniman and Gephardt discuss combating ‘cognitive warfare.’ Photography by Michael Caterina.

Dick Gephardt has one major regret from his 28 years serving in Congress.


That regret is his 1996 vote in favor of Section 230 of the Communications Decency Act, which protects social media companies and other online platforms from liability for content posted by their users.

“One of my everlasting (points of) guilt will be for voting for Section 230. We let them off the hook. We made them outside the legal system. It was a horrible mistake, and we all are working here to try to figure out how to correct that mistake,” Gephardt said May 28 at Notre Dame.

The Democratic former congressman, who served as a representative from Missouri, including six years as House Majority Leader, is now co-chair of the Council for Responsible Social Media. His comments came during a panel discussion on campus.

The conversation was part of a three-day National Convening on Social Media and Democracy gathering. The event brought together about 60 leaders and scholars to discuss misinformation, technology, social media and democracy — with a goal of collaborating and proposing nonpartisan policy changes to reduce online misinformation.

Gephardt joined in conversation with Lisa Schirch, the Richard G. Starmann Sr. Professor of the Practice of Peace Studies in the Keough School of Global Affairs and a founding member of the Council on Technology and Social Cohesion, a nonprofit that aims to influence the design of technology to strengthen trust and social cohesion. Nick Penniman, chief executive officer and co-founder of Issue One, was the moderator.

The gathering was hosted by Notre Dame’s Democracy Initiative, in partnership with Issue One and the Council for Responsible Social Media (CRSM).

Issue One is a nonprofit, nonpartisan organization with a mission of improving the political system through free and fair elections, technology reform and reducing the influence of big money in politics. Issue One established CRSM in 2022 as a cross-partisan group of leaders focusing on the negative mental, civic and public health impacts of social media.

Social media use has soared in recent years, with 85 percent of U.S. adults saying they use YouTube and 70 percent saying they use Facebook, according to 2024 data from the Pew Research Center.

But there are few regulations to control the use or manipulation of social media. The result is polarization, distrust and the spread of misinformation.

In the aftermath of the campus gathering, participants plan to draft a nonpartisan policy roadmap designed to reduce misinformation, foster healthy civic engagement and strengthen democracy.

Following are selections from the conversation.

 

On the need for regulation of social media companies:

Gephardt: You’ve got to have rules. Let me give you an analogy. What if, before the Super Bowl game, the NFL in February said, “We’ve decided this time to not have referees. May the best team win.” . . . The players would’ve literally killed each other. You would not have had a result. That’s where we are with social media.

 

On social media’s impact on the functioning of democracy:

Schirch: It’s not just that it’s polarizing us. It’s that that in turn has an effect on the ability to make public decisions about climate change or health policies or homelessness. It has this ripple effect, where we sort of corrupt our information environment. If we don’t understand what’s true and what’s not true, we can’t really make decisions anymore.

 

On the effect of widespread misinformation on young Americans:

Gephardt: If you go take a poll, especially of young people, they will tell you that politics is broken. They want nothing to do with it. (They say) it’s all fixed, wealthy interests run the country, my voice means nothing.


 

On social media platforms using algorithms to push divisive and polarizing content:

Schirch: A mathematical calculation can be made to amplify different kinds of information. It could be putting verified sources at the top. It could be putting what most people agree upon (at the top). . . . It’s a design choice (by the platforms) to amplify — to put at the top of your newsfeed — the most emotionally divisive content that is going to capture our attention and translate to profit for the major social media platforms.

 

On Meta founder and CEO Mark Zuckerberg’s announcement in January that the company (which owns and operates Facebook, Instagram and other platforms) was eliminating fact-checkers in the U.S. and replacing them with “community notes” and removing certain restrictions on posts about topics like immigration and gender:

Penniman: One of the sentences (Zuckerberg said) that just stuck in my head like an arrow was: “It means we’re going to catch less bad stuff.” . . . I did some simple math a while ago about Facebook. They made a hundred billion dollars in profit last year. They could hire 100,000 content moderators and pay each of them $100,000 a year. That would still only be 10 percent of their profit margin.

So here you have him saying, “We’re going to catch less bad stuff,” even though he can afford to catch almost everything he wants to catch. . . . What do we do as a society when you have the executive of one of the most powerful companies in human history admitting this freely?

 

On use of social media for international propaganda:

Schirch: Russia developed this strategy a long time ago of seeding propaganda — disinformation — because they know it collapses democracy. And they’re doing it very successfully. Back in the 2010s, I was in Brussels working with NATO, and NATO had a term for it: cognitive warfare. Using Facebook as a weapon to plant seeds of division within societies. [Russia] did it in Ukraine for a decade before the war. Russia’s doing this, and now many other countries are involved with cyber armies — thousands of people sitting in warehouses making fake Facebook accounts and posting content.

 

On why the practices of social media companies are one of the most pressing issues facing the nation:

Gephardt: This issue of the information culture in the country is the foundational issue of our time because nothing else will get solved unless it gets solved. There’s no silver bullet. There’s no simple, easy way to do it. There’s no one bill or one private effort that’s going to solve it. . . . Working at it and getting it done, to me, is the most important issue in our country and our world right now.


Margaret Fosmoe is an associate editor of this magazine.