McMorris Rodgers weighs in on Trump’s spat with Twitter, Congress’s role
Rep. Cathy McMorris Rodgers echoed President Donald Trump’s frustration with Twitter after the president signed an executive order May 28 targeting a key legal protection social media companies rely on, but the Spokane Republican said she has concerns about the federal government deciding what those companies may allow on their platforms.
The president’s move was made in apparent retaliation after Twitter, in a first, added fact-checking labels to two of his tweets May 27. The order targets a 1996 law that shields companies from liability for content posted on their platforms. Legal experts say that, although the order may not hold up in court, it may already have dealt a blow to free speech online.
The law, Section 230 of the Communications Decency Act, also lets tech companies decide what content should be removed from their platforms. Trump has long alleged that companies like Facebook and Twitter discriminate against conservatives in their moderation policies.
McMorris Rodgers backed Trump’s criticism of the company.
“I certainly share a lot of the president’s frustrations with Twitter’s policies,” she said. “They’ve focused on President Trump’s tweets and then they ignore propaganda from authoritarian regimes like the Chinese Communist Party, that goes unchallenged.”
On May 27, after the New York Post pressed Twitter on this point, the company added similar labels to months-old tweets by a Chinese government spokesperson promoting a conspiracy theory that the U.S. military exported COVID-19 to China.
But McMorris Rodgers, a member of the House committee charged with oversight of tech companies, offered a note of caution about the executive order, which directs federal agencies to clarify how Section 230 should apply and to investigate the content moderation policies of social media companies.
“I also believe that we have to be really careful about unintended consequences as we move forward,” she said. “I have a lot of concerns about the federal government expanding its job in a way that they would be deciding what’s OK or what’s true.”
Sen. Ron Wyden, D-Ore., co-wrote Section 230 of the Communications Decency Act as a congressman in 1996. In a virtual event hosted by the Aspen Institute on June 2, he described the provision as both “a sword and a shield.”
“What 230 says is that users, not the website that hosts the content, are the ones responsible for what they post,” he explained. “That’s … the shield. But we also gave the companies a sword so they could take down offensive content.”
That “sword” allows companies like Twitter to remove posts that violate their policies, such as pornography and threats of violence. But what to do when the person violating those policies is the president?
The day after Trump issued his executive order, Twitter labeled another tweet. In it, he warned those protesting police violence across the country that “when the looting starts, the shooting starts,” echoing a phrase used during the civil rights era by segregationist politician George Wallace and by a Florida police chief who said in 1967, “we don’t mind being accused of police brutality.”
Twitter went a step further this time, hiding the tweet unless a user clicks through a warning that reads, “This Tweet violated the Twitter Rules about glorifying violence. However, Twitter has determined that it may be in the public’s interest for the Tweet to remain accessible.”
Yet even Twitter has been cautious, saying it wouldn’t remove multiple tweets in which Trump suggested TV host and former congressman Joe Scarborough murdered a congressional staffer, despite a plea from the woman’s widower for the company to act. The conspiracy theory has been widely debunked, but in a statement Twitter said Trump’s insinuations did not violate its policies.
Trump’s executive order already faces a legal challenge after the nonprofit Center for Democracy and Technology filed a lawsuit June 2, alleging the move was unlawful retaliation against Twitter.
But Charles Duan, director of technology and innovation policy at the R Street Institute, a D.C. think tank that advocates for free markets, said the executive order may already have had its desired effect, pressuring companies to give the president’s posts special treatment.
“It’s the effect of basically trying to bully the social media companies into doing things that the president wants,” he said. “They know that the president can now impose large costs of litigation and large attorney fees on companies who say things that the president doesn’t like.”
“I think that’s going to make them somewhat more risk averse. I think that they’re going to be concerned about even doing things that appear to trigger the president,” Duan said.
That pressure seems to have had an effect on Facebook. Last October, CEO Mark Zuckerberg told Congress, “If anyone, including a politician, is saying things that can cause, that is calling for violence or could risk imminent physical harm … we will take that content down.”
But speaking May 28 on CNBC, Zuckerberg said, “I don’t think that Facebook or internet platforms in general should be arbiters of truth.”
At least two Facebook employees resigned in protest of Zuckerberg’s response. At an emergency staff meeting June 2, The Washington Post reported, thousands of Facebook employees voted to ask the CEO, “Can we please change our policies around political free speech?”
Two other social media companies said last week that they would not give special treatment to the president. Snapchat announced it would no longer promote Trump’s account, and a LinkedIn executive told staffers the platform would “restrict the speech” of any leader who violated rules about inciting violence.
Lawmakers have proposed reforms to Section 230 over the years — presumptive Democratic presidential nominee Joe Biden has even called for it to be revoked entirely — but McMorris Rodgers urged patience and said she favored changes through legislation rather than by the executive branch.
“We shouldn’t rush to make drastic changes to Section 230,” she said, “because I think it’s also really important for start-ups and new entrants to be able to have liability protection, and that’s another way to hold Big Tech accountable.”