Social media is poised to pay a price for President Donald Trump’s supporters’ rampage through the Capitol.
Democratic lawmakers channeled much of their fury at tech companies’ role in the assault on Congress, an attack organized across a plethora of online platforms and livestreamed by rioters who echoed Trump’s baseless charges of a rigged election. And they’re vowing action against dangerous extremism on the internet as they prepare to take full control of Congress and the White House.
Liberal policymakers for years have urged Silicon Valley to stamp out election disinformation and incitements to violence on their platforms. But they said Wednesday’s invasion proves the industry has done far too little.
“This is going to come back and bite ’em because Congress, in a bipartisan way, is going to come back with a vengeance,” Sen. Mark Warner of Virginia, the top Democrat on the Senate Intelligence Committee, told POLITICO.
The result is that Democrats, whose ire at the tech industry had already dramatically risen since the 2016 elections, are talking about bringing new levels of scrutiny and consequences to the companies, including stepping up efforts to narrow or overhaul liability protections for sites that host violent or dangerous messages.
Democrats also expressed optimism about getting cooperation from Republicans, despite the Trump-era GOP’s frequent focus on allegations that social media platforms practice too much censorship. And some activists said reining in online extremism needs to be a Day One priority for President-elect Joe Biden.
A slew of prominent Democrats rebuked social media companies — ranging from tech giants like Facebook and YouTube to smaller, more free-wheeling platforms like Gab and Parler — for not taking more forceful action against those who organized and executed Wednesday’s pro-Trump rally. What began as a protest, featuring an in-person address by the president, escalated into a full-blown storming of the Capitol building that left four people dead.
“Congress was attacked yesterday by a mob that was radicalized in an echo chamber that Facebook and other big platforms created,” said Democratic Rep. Tom Malinowski of New Jersey, who has criticized the way tech companies amplify potentially harmful content to their users.
The most violent messages before and after Wednesday’s swarming of the Capitol appeared on lesser-known platforms that make little or no effort to moderate their content — including Telegram, Parler and TheDonald.win, a pro-Trump site where people openly cheered the prospect of killing liberals and big tech executives.
But liberal lawmakers took particular aim at the industry’s leading companies, such as Facebook and Twitter, for not kicking Trump off their platforms, despite years of warnings from Democratic leaders, civil rights groups and other advocates that the president’s online rhetoric was inspiring real-world harm.
Those efforts met resistance Thursday from one top Republican, Rep. Cathy McMorris Rodgers of Washington state, who said that “censoring” the president would have “serious free speech consequences that will extend far beyond President Trump’s time in office.”
Facebook and Twitter took unprecedented steps to limit the reach of Trump’s messages after the Capitol riots, imposing temporary locks on his accounts that prevented him from posting.
Facebook CEO Mark Zuckerberg announced Thursday that the platform would block Trump indefinitely, at least until Biden is sworn in, writing that the company believes “the risks of allowing the President to continue to use our service during this period are simply too great.”
A Twitter spokesperson said the company’s public interest policy, which can exempt public officials like Trump from tweet removals and suspensions, “ends where we believe the risk of harm is higher and/or more severe.”
YouTube, meanwhile, took down a video in which Trump continued to fan false allegations of a “stolen” election, and it said it would begin imposing suspensions and potentially permanent bans against users who violate policies against posting unsubstantiated election fraud claims.
During the lead-up to the 2020 elections, the major platforms beefed up their policies against misinformation, slapping labels on posts that contained election-related content or misinformation and pointing users instead to authoritative news sources.
But the restrictions and takedowns did little to appease critics of both Trump and the tech companies, who said the moves didn’t go far enough and came far too late.
This week offered a dual boost to prospects for action: Tuesday’s Democratic sweep of Georgia’s Senate seats gives the party unified control of Congress for the first time since 2010. And Wednesday’s violence gives Democrats even more of an impetus to use that power to crack down on online extremism.
“It’s created a greater urgency and a greater willingness, hopefully on both sides of the aisle, to dig in and do the hard work that’s going to take to address this,” said Democratic Rep. Jennifer Wexton of Virginia. “I think it’s going to be a top priority for us in [the] 117th to come up with some sort of plan to address this kind of disinformation.”
“The one thing I hope it’s done for a lot of people is show what happens online is not separate from what happens offline,” said Karen Kornbluh, director of the Digital Innovation and Democracy Initiative at the German Marshall Fund. “We saw people who had organized online, come to Washington with their QAnon beliefs and their other ideas they’ve gotten on social media and act them out in the real world.”
One area several Democrats said was now even more ripe for congressional action: overhauling the tech industry’s legal protections under Section 230 of the Communications Decency Act, the much-debated 1996 law that shields platforms from liability over material their users post.
“Yesterday’s events will renew and focus the need for Congress to reform Big Tech’s privileges and obligations,” said Sen. Richard Blumenthal (D-Conn.). “This begins with reforming Section 230, preventing infringements on fundamental rights, stopping the destructive use of Americans’ private data, and other clear harms.”
Malinowski, who has introduced legislation with Rep. Anna Eshoo (D-Calif.) to revoke those protections in instances where platforms amplify or boost certain harmful content, said the riot at the Capitol “hastens the need” for such changes to Section 230.
Warner said he’s working on his own new proposal to revamp Section 230, a target he’s identified as a top priority for this Congress, and that he expects to have “a number of colleagues” supporting the bill. That could make his measure one of the biggest threats to the legal shield on Capitol Hill — and a more plausible one than Trump’s unsuccessful demands that Congress repeal the statute entirely to punish alleged anti-conservative bias.
Warner, who is poised to helm Senate Intel once Democrats take control of the chamber, also suggested the topic could be a big focus of the panel’s activity this Congress. “We will have much more to say on this,” he said when asked about his potential hearing plans.
While outrage over how social media companies handled the riot and the events leading up to it has been dominated by Democrats, some Republicans believe the events will also propel existing bipartisan efforts to curb harmful content on social media, according to a GOP aide to Rep. Michael McCaul of Texas.
The Texas Republican, who spent years chairing the Homeland Security Committee, plans to reintroduce a version of a bill to create a clearinghouse at the Department of Homeland Security where social media companies could voluntarily report online threats of imminent violence, the staffer said. That hub would then circulate those reports to the appropriate law enforcement authorities.
A 2019 mass shooting that killed 23 people in El Paso, Texas, motivated the year-long negotiation behind the bill, a process involving both social media companies and civil liberties groups. But the McCaul aide said that had the measure been in place, it might have helped authorities detect and ward off the violence at the Capitol this week.
Biden, who scorched Facebook and other social media companies over accusations they willfully enabled disinformation in the 2020 elections, will also face pressure from advocacy groups to scrutinize how companies handle violent and misleading content.
Jonathan Greenblatt, CEO of the Anti-Defamation League, said that in light of Wednesday’s riots, Biden should take immediate action on that front when he’s sworn in Jan. 20.
“I think it’s critical for the new administration on day one to launch a process that examines the rise of extremism and the role of the social media companies in contributing to that,” said Greenblatt, floating the idea that Vice President-elect Kamala Harris could lead such a task force.
Kornbluh, who served under Presidents Barack Obama and Bill Clinton, suggested Silicon Valley tech companies may in fact welcome the chance to make their own reset as Washington’s power dynamics shift. “To turn the page,” she said, “and adopt some of these pro-transparency policies that would crack down on disinformation and dangerous conspiracy theories and harassment.”
But Wexton, the Virginia Democrat, said companies may not have much of a choice in the matter.
“They can be on the train or they can be under it,” she said.