The TikTok hearing was bad, but that’s how Congress always talks about social media
By Kate Tummarello, Executive Director, Engine Advocacy & Foundation
Congress isn’t thrilled with TikTok, and TikTokers are even less thrilled with Congress. Since last month’s hearing in a key House committee featuring testimony from TikTok CEO Shou Zi Chew, many Internet users have worried that Congress is considering banning TikTok over national security concerns while appearing confused about what actually happens on the app.
That hearing put Congress’s concerns about the Internet in the spotlight, but these kinds of conversations about social media have been happening on Capitol Hill for years. Soon enough, Congress could change Internet laws in harmful ways without fully understanding what’s at stake. Creators and users are right to be paying attention.
It might make good fodder for impressions and jokes on TikTok, but the danger doesn’t actually lie in lawmakers appearing to bumble through terse questions and split-second opportunities for answers from understandably confused witnesses. For example, one oft-mocked exchange between Chew and Rep. Richard Hudson (R-N.C.) focused on whether TikTok uses Wi-Fi to connect to the Internet. To a trained eye, it seemed like Hudson was trying to ask whether TikTok can — by being on a phone connected to the same Wi-Fi as other devices — access data stored on those nearby devices. Given that his district includes a major military base, it makes sense that he would worry about whether an app on a civilian phone could access data on a government computer when both are on the same Wi-Fi network. To everyone else, it seemed like he didn’t understand how his home Wi-Fi works.
The real danger lies in the presumption, shared by nearly all of the members, that TikTok, social media, and the user-generated Internet more broadly are bad things. In every conversation about user content online — and the legal framework that enables companies of all sizes to host and moderate that content, more on that later — lawmakers keep coming back to the same handful of horribles: illegal drugs, child predators, self-harm encouragement, terrorist propaganda, and so on.
It’s undoubtedly true that harmful content and bad people exist on the Internet, and that the unprecedented scale and speed of the Internet give that content and those people a bigger potential audience than ever before. Online platforms of all sizes need to be proactive in keeping their corners of the Internet safe and relevant for their users. (Chew said during the hearing that TikTok has 40,000 people working on content moderation. Mark Zuckerberg similarly told Congress a few years ago that Meta was aiming to have 20,000 people reviewing content. Both companies, and almost everyone else in the business of hosting user content, invest heavily in technological tools to proactively find and remove harmful content. Even startups spend a proportionally large share of their limited resources on this.)
It’s good that these companies have incentives to moderate the content they host, but lawmakers who focus only on a handful of examples of harmful content miss the bigger picture. During that hearing, no one asked about the small businesses that use Internet platforms to attract customers. No one asked about activists using social media to organize. No one asked about the creators who can build online audiences without the approval of traditional gatekeepers like book publishers and cable TV channels. No one asked about the members of marginalized communities who are able to find and support each other online.
These are the things at stake when Congress talks about specific social media platforms like TikTok, and about anyone else on the Internet that hosts user content. But they’re almost always absent from the increasingly frequent conversations on Capitol Hill, especially those around Section 230, the law that makes it possible for anyone to host user content without worrying about being sued into bankruptcy any time one person doesn’t like another person’s content (and if you’ve ever been on the Internet, you know how often that happens).
Because of Section 230, new and innovative platforms are able to get off the ground and to host and moderate user content in ways that make the most sense for them and their communities. The result is a world where platforms like YouTube, Instagram, and TikTok operate differently and appeal to users for different reasons (and it’s not just “the algorithm”). More importantly, it’s a world where a company aiming to be the next YouTube, Instagram, or TikTok can launch and keep improving opportunities for sharing content and finding audiences. Changes to Section 230 could mean fewer places for people to gather online and less user content on the Internet.
But that’s not what lawmakers see when they look at Internet platforms and the laws that make them possible. They see examples of the most troubling content, and those examples are what motivate them to do everything from banning TikTok, to amending Section 230, to prohibiting many teenagers from using social media. Instead, they need to be reminded of what should be obvious: that people across the country and around the world use the Internet every day to connect, share, learn, entertain, and more, and that sweeping proposals to limit that connecting and sharing will cause more harm than good.
So keep posting about how little Congress understands the Internet, but also engage with your lawmakers to help them learn and do better. Your ability to post online might just depend on it.
Engine is a non-profit technology policy, research, and advocacy organization that bridges the gap between policymakers and startups. Engine works with government and a community of thousands of high-technology, growth-oriented startups across the nation to support the development of technology entrepreneurship through economic research, policy analysis, and advocacy on local and national issues.