Bipartisan concern over Big Tech’s impact on children could mean better odds of new regulations
WASHINGTON – On March 25, Rep. Cathy McMorris Rodgers faced the CEOs of three of the world’s most powerful tech companies in a virtual hearing.
The leaders of Facebook, Twitter and Alphabet – which owns Google and YouTube – had been called before the House Energy and Commerce Committee to be grilled on an issue the two parties see very differently. In the eyes of Democrats, the companies don't do enough to stop the spread of dangerous misinformation on their platforms. To Republicans, they abuse their content-moderation power to censor conservative voices.
With the two halves of a near-evenly divided Congress pushing in opposite directions, tech executives have had to do little more than endure the occasional hearing. But McMorris Rodgers, the Spokane Republican who leads the panel’s GOP members, chose to focus on a different issue – one that suggests the CEOs may have more to worry about.
“Do you know what convinced me Big Tech is a destructive force?” she said in her opening remarks. “It’s how you’ve abused your power to manipulate and harm our children. Your platforms are my biggest fear as a parent.”
After McMorris Rodgers asked Facebook CEO Mark Zuckerberg – who said he limits his own kids’ screen time – about the impact of his company’s products on children, Democratic Rep. Kim Schrier sent the Spokane Republican a text: “Great job.”
Schrier, who represents a district that spans the Cascades from Wenatchee to the Seattle suburbs, worked as a pediatrician before she entered Congress in 2019. Like McMorris Rodgers, she thinks about the issue as a parent.
“Sam has very restricted screen time,” Schrier said of her 12-year-old son. “And I’ve been having really frank conversations with him about social media, and how this is companies using kids and marketing his information to private companies, and I think he’s getting a sense of the good, the bad and the ugly.”
In an interview, McMorris Rodgers emphasized regulating tech companies is complex and will require a thoughtful approach, in part because new start-up companies rely on the same protections that let established platforms like Alphabet’s YouTube and Facebook’s Instagram operate with limited restrictions. At the same time, she said, inaction is not an option.
“I think that parents need to be better equipped to be able to protect their children on these platforms, and right now it’s daunting as a parent to try to take on Big Tech,” McMorris Rodgers said. “I don’t think it’s really a fair fight for any parent, and I’m not alone in that. I’ve heard from so many other parents that Big Tech is a big concern.”
The March 25 hearing, McMorris Rodgers said, “was a pretty significant rebuke of Big Tech by both Republicans and Democrats, and I believe that we were putting Big Tech on notice that we expect things to change, that business as usual is not acceptable.”
“If we’re going to be successful in holding Big Tech accountable, it’s going to mean Republicans and Democrats will have to work together on making those changes,” she said. “That’s what happened in 1996. It was a bipartisan compromise that led to the current law.”
That current law is a short provision in the Communications Decency Act of 1996, Congress's first major attempt to regulate online pornography, known as Section 230. The Supreme Court struck down most of the law as unconstitutional in 1997 but left Section 230 standing, a provision whose famous 26 words shield companies from liability for content posted on their websites, and which also gives them the authority to remove content as they see fit.
“What really struck me, particularly in listening to Rep. McMorris Rodgers during that hearing, was how in a way it’s going back to where Section 230 began,” said Margaret O’Mara, a professor of history at the University of Washington. “It was very much framed in the terms of protecting children.”
O’Mara, who studies the history and political involvement of the tech industry, said Democrats and Republicans will need to put their combined weight behind a policy aimed at protecting young people in order to overcome the growing political influence of Big Tech.
“This is going to be an ongoing process, and it’s going to be a battle with many fronts,” she said. “The tech companies themselves have built formidable lobbying operations. It’s a recognition that regulation is happening, and these companies want to be at the table, shaping what that regulation looks like.”
Tech companies have scaled up their donations to lawmakers, including Northwest Democrats and Republicans alike, amid mounting scrutiny in recent years. Federal agencies and state governments filed antitrust lawsuits against both Google and Facebook late last year.
“Tech often has portrayed itself as outside politics,” O’Mara said. “Throughout its history, it has always lobbied Washington, but now it is lobbying to a degree that it never has before. They’re bringing out the big guns.”
While Congress has struggled to come to a consensus on what reforms to Section 230 could look like, regulating tech companies’ ability to target young people could be an easier legislative lift, said Ariel Fox Johnson, senior counsel for global policy at Common Sense Media, a nonprofit that advocates for online protections for kids and teens.
“I think one of the biggest potential places for bipartisan work and agreement is when we’re talking about protecting kids and teens,” Fox Johnson said. “This has historically been an area of bipartisan agreement, and over the last decade we’ve seen what would often be very strange bedfellows on other issues.”
The bill that may have the best odds of becoming law in the near term, Fox Johnson said, is the CAMRA Act, which would fund research into the effects of media on children.
Sen. Ed Markey, a progressive Democrat from Massachusetts, introduced that legislation in 2019 with several Republicans, including Sen. Josh Hawley of Missouri, who led the formal objection to President Joe Biden’s victory that preceded the Jan. 6 siege of the Capitol.
While much of the conversation around regulating the tech industry has revolved around Section 230, Fox Johnson said lawmakers could take meaningful steps toward protecting kids online without touching that law. Markey and Hawley have also introduced a bill that would update the Children's Online Privacy Protection Act, which Congress passed in 1998.
“Section 230 is a specter that relates to and hangs over all types of these platform accountability discussions,” she said. “Those are all things we can do without even needing to wade into Section 230.”
Schrier said while she shares many of the concerns McMorris Rodgers expressed about the harms of tech platforms, she also sees a positive side of social media.
“For example, if you happen to be a gay teen in a very conservative household, in a very conservative religious group, in a very conservative state, you have a very high risk of suicide, and social media can actually help you connect with a community that’s accepting and give you some hope and comfort,” she said.
While children’s screen time has doubled during the COVID-19 pandemic, according to monitoring company Qustodio, Fox Johnson said the effects of social media on young people are complex, pointing to research Common Sense Media released in March that found social media can both worsen depression and serve as a vital resource for kids and teens who lack support and connections.
“On balance, I would say it’s more harm than good,” Schrier said, “especially because it can become very addictive. It keeps you in this walled garden where you just keep clicking deeper and deeper in.”
If Congress is to rein in the tech giants, it will likely be because lawmakers like McMorris Rodgers and Schrier come together over their shared desire to protect kids online.
“No matter who you are, everyone has kids and grandkids and is worried about the time people are spending online,” Fox Johnson said. “These are areas of universal concern and certainly a real area of bipartisan interest.”