Recent Developments in the U.S. Legislature’s Fight Against TikTok

TikTok, owned by the Chinese company ByteDance, has faced scrutiny due to concerns that the Chinese government could access user data. These concerns are compounded by Chinese laws that could compel companies based in China to comply with requests from state intelligence services. TikTok, however, has consistently denied any association with Beijing’s surveillance or propaganda operations.

The Chinese-owned app has faced scrutiny worldwide, including in India, France, and Nepal, over concerns that user data could be passed on to the Chinese government. India’s Ministry of Electronics and Information Technology banned 59 Chinese apps, including TikTok, in 2020, saying they “unauthorisedly steal and surreptitiously transfer users’ data to China”. In February 2023, the European Commission and the European Council banned TikTok from official devices. In March 2023 alone, Australia, Belgium, France, Denmark, Latvia, Lithuania, Norway, New Zealand, and the United Kingdom made similar decisions. In May 2023, following advice from the Austrian intelligence services and several ministerial experts, the Austrian federal government likewise banned the private use and installation of TikTok on the work devices of federal employees.

The main reason lawmakers give for these moves is the data collection policy of ByteDance, TikTok’s parent company. Regulators worry that the company could allow the Chinese government to access sensitive user data and use it for intelligence operations.

Most of the evidence suggests that TikTok’s data collection is comparable to that of other social networks whose business model rests on data collection, such as Facebook. The concern, however, is one that its US rivals do not face: that TikTok could be compelled to give the Chinese government full access to the data it collects. TikTok denies this and has subjected itself to various transparency measures to allay fears about data collection and data flows. For example, it has set up a “Transparency and Accountability Center” designed to let regulators, academics, and auditors learn more about the app’s operations and security practices.

Concerns about TikTok were also raised at a congressional hearing in the US, where the company’s CEO Shou Zi Chew defended TikTok’s privacy practices, saying they are in line with those of other social media platforms and that the app in many cases collects less data than its competitors. Despite this, there is growing talk in the US of banning the app altogether. The U.S. legislature is actively considering measures to that end, focused on national security concerns tied to the app’s Chinese ownership. This legislative effort is bipartisan and has gained momentum in recent months.

One of the key pieces of legislation in this regard is the RESTRICT Act, formally the “Restricting the Emergence of Security Threats that Risk Information and Communications Technology Act of 2023.” Introduced by a bipartisan group of senators, the bill addresses national security threats posed by foreign information and communications technologies (ICTs) and would give the federal government new powers to restrict, and potentially ban, technologies from China and five other nations identified as U.S. adversaries. It empowers the Secretary of Commerce to review and potentially prohibit certain transactions between U.S. persons and foreign adversaries involving ICT products and services. Although the bill does not exclusively target TikTok, the app is a primary concern for its sponsors. Politico reports that with support for the RESTRICT Act now significantly reduced, attention is turning to an alternative bill: the Guard Act.

This bill would give the Department of Commerce more authority to ban TikTok and other foreign-based apps. It comes after previous efforts to regulate the app ran into obstacles, including legal concerns and the difficulty of addressing national security risks while accounting for TikTok’s massive U.S. user base. The Guard Act is intended to address these concerns more effectively, including the First Amendment issues that have hindered earlier legislative actions against TikTok. There are doubts, however, about whether it can attract the necessary bipartisan support.

The movement to ban TikTok is not confined to the federal level. One notable effort came in Montana. In early December, Federal Judge Donald Molloy blocked Montana’s TikTok ban, SB 419, finding it unconstitutional, a decision that underscores the delicate balance between state legislation and constitutional rights and marks a critical point in the ongoing discourse on digital rights and state authority.

Montana’s attempt to legislate a ban on TikTok, driven by Attorney General Austin Knudsen, was fraught with constitutional challenges from its inception. Labeled “laughably unconstitutional” by critics, the bill aimed at suppressing specific online content, clashing directly with First Amendment rights. Judge Molloy’s decision hinged on the bill’s inability to pass even intermediate scrutiny, the standard typically applied to laws that potentially burden protected speech. The judge emphasized that the bill uniquely targeted TikTok and its user-generated content, which falls under the protection of the First Amendment.

A central facet of the state’s defense was the argument that the TikTok ban was a consumer protection measure, similar to other laws within the state’s purview. Judge Molloy, however, identified a critical flaw in this reasoning: consumer protection laws do not characteristically target specific forms of speech. This distinction became a cornerstone of his ruling, as the ban deviated from the general application of consumer protection statutes.

The state also attempted to draw parallels with Arcara v. Cloud Books, in which a bookstore used for illegal activities was shut down. The comparison, however, fell short. Judge Molloy pointed out the stark difference: SB 419, unlike the law in Arcara, targeted a specific form of speech and expression.

Further complicating the state’s position was the bill’s failure to meet the criteria of intermediate scrutiny. It did not demonstrate an important state interest, nor was it narrowly tailored to address the specific concerns it purported to target. The law also failed to provide alternative channels for the communication it sought to regulate, effectively burdening more speech than necessary.

The implications of this ruling are far-reaching. Firstly, it reaffirms the importance of First Amendment protections in the digital age, especially when considering state actions that could potentially impinge on these rights. The ruling also serves as a reminder of the limitations of state power, particularly in matters that intersect with federal interests and constitutional guarantees. Moreover, the decision by Judge Molloy reflects the growing judicial recognition of the unique challenges posed by digital platforms and the content they host. The judgment acknowledges the evolving nature of speech and expression in the digital realm, underscoring the need for laws that are not only constitutionally compliant but also cognizant of the complexities of modern communication mediums.

A state initiative in Indiana has met a similar fate. Indiana’s recent lawsuit against TikTok, alleging violations of child safety laws, was dismissed in a ruling that echoes the legal challenges facing states that attempt to regulate digital platforms. The suit aimed to address concerns similar to those raised in state lawsuits against Meta, but it fell short in court, primarily due to jurisdictional issues and the nature of the allegations against TikTok.

Jurisdictionally, the Indiana court found that the state had failed to establish sufficient grounds to claim jurisdiction over TikTok. The crux of the issue lay in the lack of specific targeting of Indiana users by TikTok. The platform’s presence in Indiana, facilitated through third-party app stores, did not amount to TikTok specifically targeting the state. This distinction is crucial in legal terms, as the mere operation of an interactive website accessible in a state does not necessarily subject a company to that state’s jurisdiction.

Furthermore, the court’s analysis of the Deceptive Consumer Sales Act (DCSA) revealed another critical flaw in Indiana’s lawsuit. The act of downloading a free app, such as TikTok, did not constitute a consumer transaction under the DCSA, effectively nullifying the state’s claim under this law. This aspect of the ruling highlights the challenges states face when applying traditional consumer protection laws to the unique domain of digital platforms and free apps.

The court also scrutinized the nature of the alleged deceptive acts by TikTok. It noted the absence of material misrepresentations that could have influenced Indiana users to download and use the platform. This lack of a direct connection between TikTok’s actions and consumer decisions in Indiana was pivotal in the court’s decision to dismiss the case.

The dismissal of Indiana’s lawsuit against TikTok is emblematic of the broader legal and regulatory challenges states face when confronting digital giants. It underscores the complexity of applying traditional legal frameworks to the dynamic and borderless world of digital platforms. As states grapple with these challenges, this ruling serves as a reminder of the importance of clear jurisdictional grounds and the need for a nuanced understanding of digital consumer interactions in legal disputes.

These failed attempts to rein in TikTok shed light on the limitations of state-level legal action against digital platforms and signal the need for more informed and targeted legal strategies. As digital platforms continue to evolve, so too must the legal approaches used to regulate them, balancing the need for consumer protection with the realities of the digital age. These decisions not only highlight the constitutional pitfalls of state-level bans but also set a precedent for how similar cases might be approached in the future.


János Tamás Papp JD, PhD is an assistant professor at Pázmány Péter Catholic University, Hungary, and a legal expert at the Department of Online Platforms of the National Media and Infocommunications Authority of Hungary. He has taught civil and constitutional law since 2015 and is a founding member of the Media Law Research Group of the Department of Private Law. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of Pázmány Péter Catholic University. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications regarding social media and the law, including a book titled “Regulation of Social Media Platforms in Protection of Democratic Discourses”.
