Over the course of just 48 hours, Congress moved more aggressively on child online safety than it has in nearly a decade—and a federal court quietly reminded everyone that the entire framework might not survive a constitutional challenge that pits safety against privacy.
On March 4, the App Store Accountability Act passed out of the House Energy and Commerce Committee on a 26-23 vote. The following day, the same committee advanced the Kids Internet and Digital Safety (KIDS) Act, a package that includes the Kids Online Safety Act (KOSA), to the full House floor in a party-line 28-24 vote. Meanwhile, the Senate unanimously passed the Children’s Online Privacy Protection Act (COPPA) 2.0. For the first time in years, both chambers of a divided Congress moved on children’s digital safety in the same week. And while the government may be writing the rules, social media companies are struggling to keep up, walking a fine line between privacy and safety.
The pressure on social media companies to do something about minors on their platforms long predates either act. Meta, Snap, TikTok, and YouTube have faced years of congressional hearings, state attorneys general investigations, and now a mounting wave of personal injury litigation alleging their platforms knowingly exposed children to harmful content and addictive design features.
The industry’s self-regulatory response (age minimums set at 13, parental control settings buried in app menus, terms of service that minors routinely circumvent) has satisfied almost no one. What changed in 2026 is that lawmakers stopped waiting for the industry to fix the problem itself. Two bills now moving through Congress take fundamentally different approaches to the same crisis, and understanding both requires understanding what each one is actually trying to fix.
Two laws, two approaches
With KOSA and the App Store Accountability Act, the government is attacking the problem from opposite ends. KOSA requires companies to conduct risk assessments, apply protective default settings to the accounts of minors up to age 17, disclose how their recommendation algorithms work, and give parents meaningful oversight tools. The App Store Accountability Act tries to stop the problem before a child ever opens an app: it requires age verification at the account level, parental consent for each download by a minor, and the linking of a child’s device to a parent or guardian. Alabama became the fourth state to enact a version of the law in February 2026, joining Utah, Louisiana, and Texas, with other states looking to put it on the books.
The state laws add another layer of complexity, especially since each of the four states imposes different restrictions, meaning no two laws are the same. Jacqueline Klosek, a partner in global law firm Goodwin’s technology practice, says the overlapping demands are already straining clients. “As a practitioner, I myself am very much challenged by the morass of laws at the state level, and the clients I deal with are also challenged by this. Nobody’s just functioning in one state, and there’s a plethora of laws out there,” she told Fortune. By raising the age of protection to 17, KOSA closes off what Klosek calls the industry’s longtime workaround. “There’s no longer going to be this kind of somewhat easy out, and saying, ‘I’ll just focus on users above their teens and not worry so much about this.’ If I’m dealing with minors at all, I have to think more holistically about privacy, security, and safety.”
Critics, including House Democrats, argue the House version of KOSA is weaker than its Senate counterpart because it strips out a “duty of care” provision that would require companies to design products with children’s safety in mind.
Roman Karachinsky, Chief Product Officer at Incode Technologies—whose social media clients include TikTok—sees the compliance complexity as a symptom of regulation that hasn’t yet caught up to itself. “There’s a lot of regulatory requirements right now that are well intentioned and written in a way that makes sense, but are not prescriptive,” he told Fortune. “Each company kind of needs to figure out, ‘We have this duty of care to verify that our users are not minors, but how do we do that?'”
COPPA, first passed in 1998 (notably before the social media era), requires websites and apps to obtain verifiable parental consent before collecting personal data from children under 13. COPPA 2.0 extends those protections to ages 13 through 16, bans targeted advertising to minors entirely, and creates a dedicated FTC enforcement division, closing the loophole that let companies treat teenagers as unprotected users.
Collecting underage users’ private data in the name of keeping them safe has long posed a contradiction, but Karachinsky said that with the new version of COPPA, “at least that particular contradiction has been somewhat resolved. You can process data for minors only for the purpose of age verification, as long as you don’t store it, don’t reuse it for any other purpose, and immediately delete it, which I think is all fairly reasonable.” But the global picture remains chaotic. “If you think about a global company that operates in basically every market in the world, the compliance burden that you have to go through to figure out all these different requirements is really high,” he said.
Klosek, who has watched clients navigate this landscape for years, says the frustration is structural. “I think industry, parents, and government all see an issue, a problem—we’re just struggling to identify the best solutions.”