We’ve been hearing for years how TikTok hoovers up data globally and presents it to its parent company in China, and potentially thence to the powers that be. But despite renewed calls today from FCC Commissioner Brendan Carr, the popular app is very unlikely to be outright banned. That doesn’t mean it will be allowed to carry on with impunity, though.
Commissioner Carr’s opinion appeared in an interview with Axios, during which he stated that he doesn’t believe “anything other than a ban” would be sufficient to protect Americans’ data from collection by Chinese companies and authorities. (To be clear, this is him expressing his own position, not the FCC’s; I asked two others at the agency for comment and have not received any response.)
This isn’t the first time Carr has voiced this idea. After BuzzFeed News reported data improprieties implied by leaked internal communications, he wrote in June to Apple and Google calling the app an “unacceptable national security risk” and asking the companies to remove it from their app stores. They didn’t, and now it’s back to the question of federal action — first pondered by the Trump administration, which despite many actions restricting China’s reach in the U.S. never managed to get a lock on TikTok.
The reason for that is pretty simple: it would be political self-sabotage. TikTok is not just a wildly popular app, it’s the life raft to which a generation that abandoned the noble ships Facebook, Instagram, and soon Twitter has clung for years. And it clings largely because American companies haven’t come close to replicating TikTok’s feat of algorithmic addiction.
TikTok’s success in gluing Gen Z to their phones isn’t necessarily a good or bad thing — that’s a different discussion. Taking as a given its place in the zeitgeist, however, it makes a ban politically risky for multiple reasons.
First, it would be tremendously unpopular. The disaffected-youth vote is supremely important right now, and any President, Senator, or Representative who supports such a ban would be given extreme side-eye by the youth. Already out of touch with technology and the priorities of the younger generation, D.C. would now also be seen as the fun police. Whether that drives voters to the other side or simply keeps them home, there are no good outcomes. Banning TikTok does not secure votes, and that is fatal before you even start thinking about how to do it. (Not to mention it kind of looks like the government intervening to give flailing U.S. social media companies a boost.)
Second, there isn’t a clear path to a ban. The FCC can’t do it (no jurisdiction). Despite the supposed national security threat, the Pentagon can’t do it (ditto). The feds can’t force Apple and Google to do it (First Amendment). Congress won’t do it (see above). An executive order won’t do it (too broad). No judge will do it (no plausible case). All paths to bans are impractical for one reason or another.
Third, any effective ban would be a messy, drawn-out, contested thing with no guarantee of success. Imagine that somehow the government forced Apple and Google to remove TikTok from their stores and remotely wipe or disable it on phones. No one likes that look — the companies look too weak and too strong, letting the feds push them around and then showing off their power to reach out and touch “your” device. An IP-based ban would be easily circumvented but also set another unpleasant censorship precedent that ironically would make the U.S. look a lot more like China. And even should either or both of these be attempted, they’d be opposed in court by not just ByteDance but companies from around the world that don’t want the same thing to happen to them if they get a hit and the government doesn’t like it.
For those reasons and more, an outright ban by law, decision or act of god is a very unlikely thing. But don’t worry: there are other tools in the toolbox.
If you can’t beat ’em, bother ’em
The government may not be able to kick TikTok out of the country, but that doesn’t mean they have to be nice about letting them stay. In fact, it’s probable that they’ll do their best to make it downright unpleasant.
The company and its service exist in something of a loophole, regulator-wise, like most social media companies. The addition of Chinese ownership is both a complicator and an opportunity.
It’s more complicated because the U.S. can’t directly affect ByteDance’s policies. On the other hand, because China is a “foreign adversary,” its ascendancy over private industry is a legitimate national security concern and policy can be shaped around that. This opens the door to various more independent agencies that are free to set rules within their remits — the FCC can’t, in this case, make a case. But what about the Commerce Department? Homeland Security? The FTC? For that matter, what about states like California?
Rule-making agencies have a free hand — and likely tacit Congressional backing — to extend their own fiefdoms to the edges of TikTok, with national security acting as a catch-all reason. If Commerce adds “connected software applications” to supply chain security rules as it has proposed, suddenly the data coming and going through the app is arguably under its protection. (This would all be spelled out in various definitions and filings at the time of the rulemaking.)
What if TikTok’s source code, user data, and other important resources were subject to regular audits to make sure they complied with cross-border data supply chain rules? Well, it’s a pain in the neck for ByteDance, because it needs to scour its code base to make sure it isn’t giving too much away. Having to prove that it handles data the way it says it does, to the satisfaction of U.S. authorities given free rein to be picky — not pleasant at all. And that’s just from a relatively quick rule change — imagine the FTC getting new authority to audit algorithmic recommendations!
More importantly, it gives the U.S. government a chain to yank should ByteDance not comply. It’s one thing to say “we think this company is mishandling U.S. citizens’ data and we’re going to ban it.” It’s quite another to say “an investigation by auditors found that ByteDance misrepresented its data handling techniques, and if they are not fixed in 90 days the app will be in violation of the law and removed from app stores.”
Neither Apple nor Google wants to remove TikTok from its store, but again, it’s one thing to say “the feds asked us” and another to say “we must comply with the law; it’s out of our hands.”
TikTok may have proven itself impervious to action by the highest levels of government, but that just means the job gets passed to a small army of bureaucrats who’d love to be the ones who hogtied this particular greased pig. That’s not a rodeo any company wants to find itself a part of — American, Chinese, or otherwise.