Christopher Gardner and David Inserra
On March 5, the House Energy and Commerce Committee reported Sammy’s Law to the full House. This bill proposes creating a market for third-party safety software providers by mandating that leading social media companies develop mechanisms that give parents and children control over their experiences on social media. The approach has promise. But the devil is in the details.
An Innovative Legislative Approach
This bill takes a fundamentally different approach from other online child safety laws. Sammy’s Law is intended to create a new market for social media monitoring tools designed to keep children safe. Notably, this solution does not rely on broad online age-verification regimes, which carry significant privacy and safety risks of their own. Instead, the bill prioritizes giving parents and children effective tools to keep themselves safe.
The core mechanism of this bill is a mandate for large social media platform providers to create and maintain an application programming interface (API). An API is a structured interface that allows two pieces of software to communicate and exchange data. The creation of these APIs would allow innovators and large social media platforms to compete on safety settings and services. This reflects the broader principle that parents, not the state, should control how their children use social media.
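To make the mechanism concrete, the sketch below shows one way such an interface could be exposed to a safety provider. The bill prescribes no particular design, so the endpoint path, settings fields, and token-based authorization here are purely illustrative assumptions, not provisions of the bill.

```ts
// Hypothetical sketch only: the bill mandates an API but specifies no
// design. Every endpoint, field, and credential below is an assumption.

interface SafetySettings {
  dmFilterLevel: "off" | "moderate" | "strict"; // hypothetical controls
  screenTimeLimitMinutes: number;
  blockedKeywords: string[];
}

// A safety provider, acting with parental authorization, pushes a child's
// settings to the platform through the mandated API.
async function applyParentalSettings(
  platformApiBase: string, // e.g., "https://api.example-platform.com"
  childAccountId: string,
  authToken: string, // delegated, parent-authorized credential
  settings: SafetySettings,
): Promise<void> {
  const res = await fetch(
    `${platformApiBase}/v1/accounts/${childAccountId}/safety-settings`,
    {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${authToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(settings),
    },
  );
  if (!res.ok) throw new Error(`Platform rejected update: ${res.status}`);
}
```

Whatever final shape regulators settle on, platforms and providers would still need to agree on authorization flows, rate limits, and data schemas, and that engineering surface is where the security concerns discussed below arise.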
An Imperfect Execution
Despite its overall promise, the bill’s details raise several potential concerns. On the technical side, the creation of an API, particularly under the tight constraint of 30 days after regulatory guidance is issued, may introduce new vulnerabilities into the cybersecurity infrastructure of large social media platforms. The bill’s permitted data disclosures also raise multiple privacy concerns, including when third-party safety software providers (safety providers) may share information with authorities and how much access they gain to other children’s private data. The bill’s requirements may also burden the development of this new category of safety provider.
Technical Difficulties:
Constructing and maintaining an API is a resource-intensive process. Most large social media companies already maintain APIs for advertisers or developers, but that experience only partially offsets the setup costs this bill would impose. These companies would still bear the ongoing costs of maintaining and protecting the new API, an expense that should not be dismissed out of hand.
A significant concern is the security risk associated with creating a new API. In 2023, 71% of all internet traffic passed through APIs—a figure that has likely grown as automated online services have expanded. This makes them a major source of exposure for platforms, particularly given “that the average API breach leads to at least 10 times more leaked data than the average security breach,” according to a 2024 Gartner Market Report. In fact, several major historical data leaks have been caused by the abuse or breach of an API.
Before it shuttered in 2018, Cambridge Analytica abused the Facebook API to harvest data from 87 million Facebook users. Another major breach occurred in 2023, when a T-Mobile API was compromised, exposing the data of 37 million accounts. API endpoints present major targets for bad actors. While the bill provides social media platforms with liability protections for their APIs, that does little to address the underlying security risks created by this mandate and its relatively short timeline.
Privacy Problems:
Turning to the broader privacy issues associated with this bill, the permitted disclosures paragraph of Sammy’s Law (§ 4(f)(1) (2025)) raises serious concerns. This section ostensibly carves out a few narrow instances in which user data can be shared. The intent, balancing parents’ ability to protect their children against the child’s privacy and autonomy, is admirable. The problem, however, is that several provisions allow or encourage broad disclosure of, and access to, personal information.
The first concern is the vagueness of subparagraph D (§ 4(f)(1)(D) (2025)), allowing the disclosure of a child’s data to any person “reasonably able to prevent or lessen” a reasonably foreseeable serious and imminent threat “to the health or safety of any individual.” On its face, this seems like an honest attempt to prevent harm to minors. In its application, it will likely lead to overreporting children’s online behavior to law enforcement. Giving safety providers the ability to disclose messages that may lead to harm implicitly imposes on them the legal responsibility to disclose those messages. That obligation exposes safety providers to business-ending liability, including negligence and wrongful death lawsuits.
To avoid such lawsuits, safety providers will take the most conservative approach possible towards compliance with Sammy’s Law. This means that any online behavior that can plausibly be interpreted as a “serious and imminent threat” will get reported, in many cases to law enforcement.
The implications of this are severe. Sammy’s Law attempts to keep children safe online, but its application may make children less safe offline. The bill puts safety providers in a tremendously difficult position that may force them to overreport children to law enforcement. Online messages, particularly text, are notorious for their inability to communicate full context and tone. What may be a simple joke or gallows humor can be read as something far more serious. Even if these reports ultimately lead to nothing, extensive research has shown that a child’s exposure to police is connected with more illegal behavior, less self-control, and a greater likelihood of subsequent arrest. Instead, this bill should recommit to its core principle of parental autonomy and permit these same disclosures only to a parent or legal guardian. This allows those who actually understand the context surrounding a child to intervene if necessary. It also holds the parent, not a software provider, responsible for their child.
A related privacy concern is that safety providers gain access to information beyond that of the child whose account they are authorized to monitor. Social media, by definition, means communicating with other people. If a monitored child receives an encrypted message from another user, the safety provider also has access to that other user’s communication. That other user may have expected the conversation to be private, yet a safety provider was monitoring it all along. And if the safety provider finds content that could be construed as dangerous, it might report that other user to the authorities. In this way, a safety provider may make it difficult for children to confide in one another about serious topics for fear of being reported to other children’s parents or even to law enforcement.
Another concern is how the government could weaponize these safety providers to sidestep encryption. Today, many users’ communications are protected by strong end-to-end encryption, which the government cannot break and to which companies cannot grant access. If a safety provider has been granted access, however, the government can reach those conversations through the provider. Recent government abuse of administrative subpoenas, which can be issued without a judge or grand jury order, to silence critics highlights this concern. Access to encrypted communications also amplifies the risks associated with any data breach.
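A minimal sketch can illustrate the architectural point. Assume a hypothetical messaging flow in which the child’s device decrypts a message and, as the bill contemplates, hands readable content to an authorized safety provider; none of these names correspond to any real platform API.

```ts
// Conceptual sketch with hypothetical names: why provider access sidesteps
// end-to-end encryption (E2EE). Under E2EE, only the endpoints hold keys;
// the platform relays ciphertext it cannot read.

type Ciphertext = { blob: string }; // opaque to the platform
type Plaintext = { text: string };  // readable only at an endpoint

// Stand-in for on-device decryption. A real app would use an E2EE protocol
// such as Signal's; base64 here is a placeholder for the point being made.
function decryptOnDevice(msg: Ciphertext): Plaintext {
  return { text: Buffer.from(msg.blob, "base64").toString("utf8") };
}

// The mandate effectively makes an authorized safety provider a third
// endpoint: it receives content after the device decrypts it.
interface SafetyProvider {
  inspect(msg: Plaintext): void;
}

function deliver(msg: Ciphertext, provider: SafetyProvider): Plaintext {
  const plain = decryptOnDevice(msg);
  provider.inspect(plain); // the provider now holds readable content
  return plain;
}

// The encryption is never "broken"; it is routed around. Anyone who can
// compel the provider (say, via an administrative subpoena) can obtain
// plaintext the platform itself could never produce.
const provider: SafetyProvider = { inspect: (m) => console.log(m.text) };
deliver({ blob: Buffer.from("see you at 8").toString("base64") }, provider);
```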
Regulatory Restrictions:
Sammy’s Law creates a series of legal and compliance requirements for safety providers. These include registration with the Federal Trade Commission (FTC), restrictions on where data may be geographically stored, an independent audit as part of the registration process, and annual independent audits thereafter. All of these requirements are tremendously expensive, especially for entrants into what would be a new and risky market.
If this bill wants to create a market for safety providers, then it needs to be careful not to create insurmountable barriers to entry. Parents and children will only reap the benefits of such safety tools if lawmakers allow them to exist. This bill’s current approach risks creating barriers to entry, limiting the number of safety providers, and increasing their costs to consumers.
The FTC’s position as the regulator of this market also opens the potential for abuse. Control over which safety providers can reach the market may help ensure that certain standards are met in all software available to parents. But it also allows the FTC to lean on safety providers to selectively monitor certain viewpoints or ideas. The bill does give providers the right to appeal, but this concern nonetheless deserves mention whenever enforcement is delegated to an executive agency.
Conclusion
Sammy’s Law is an innovative piece of kids’ online safety legislation. It generally adheres to the principle that parents, not the government, should decide how their children use social media. That said, there are major concerns about when, why, and to whom user data may be disclosed. In addition, the short timeline for implementing the mandated APIs presents significant technical and cybersecurity risks for everyone involved.