America desperately needs new privacy laws

This is Stepback, a weekly newsletter discussing one major story from the world of technology. For more on the dire state of tech regulation, follow Adi Robertson. Stepback arrives in our subscribers’ inboxes at 8:00 a.m. ET. Sign up for Stepback here.

In 1973, long before the modern digital era, the US Department of Health, Education, and Welfare (HEW) issued a report called “Records, Computers, and the Rights of Citizens.” Networked computers “seemed destined to become the primary medium for creating, storing, and using records about people,” the report’s preface began. These systems could be a “powerful management tool.” But with few legal safeguards, they could infringe on the basic human right to privacy – in particular “an individual’s control over the use of information about him.”

These concerns were not just cheap talk in Washington. In 1974, Congress passed the Privacy Act, which established some of the first rules focused on computer systems of record — limiting when government agencies could share information and outlining what access individuals should have. Over the course of the 20th century, the Privacy Act was joined by other privacy rules for fields including health care, children’s websites, electronic communications, and even videotape rentals. But over the past few decades, amid an explosion of digital surveillance by governments and private companies, Congress has repeatedly failed to keep up.

Lawmakers have considered numerous plans to preserve Americans’ privacy, but they have failed time and time again. Attempts to rein in government spying — such as proposed updates to the Electronic Communications Privacy Act of 1986 — have been stymied by concerns that they would jeopardize police and counterterrorism operations. Despite several concerted attempts by members of both parties, Congress has not passed a bill to regulate how private companies collect data and what rights people have to their own information. Even highly targeted proposals like the Fourth Amendment Is Not For Sale Act — which would restrict police from circumventing existing privacy laws by buying information from data brokers — have failed to become law.

Meanwhile, new technologies, from augmented reality glasses to generative artificial intelligence, are creating new risks every day – making it easier than ever to surreptitiously track people or encourage the sharing of intimate information with technology platforms.

Immigration agents are harassing citizens they have identified using data analytics and facial recognition tools. Data breaches at large tech companies are common, and security regulations meant to prevent them are being rolled back. Amazon just aired a Super Bowl ad boasting that your doorbell could become part of a distributed tracking network to find lost dogs.

Every privacy breach not only risks revealing something intimate about you to the world; it also shifts the balance of power toward whoever holds the most data. Consider algorithmic pricing, where companies use personal data about shoppers to set individualized prices they estimate people will pay — resulting in companies like Instacart charging users different prices for the same item. (The company said this was an experiment that has since ended.)

Some privacy risks are addressed by state and international regulations. Companies in Europe have been governed by the General Data Protection Regulation (GDPR) since 2018, although proposals to scale it back surfaced late last year. Several states have passed some form of general privacy framework as well as more specific rules — for example, Illinois’ biometric privacy law made it easier to file lawsuits against Meta and others, and New York mandated disclosure of algorithmic pricing a few months ago. But privacy advocates warn that many of these rules are insufficient. The Electronic Privacy Information Center (EPIC) and the US PIRG Education Fund rated state consumer privacy laws in 2025, and only two states, California and Maryland, earned more than a C.

EPIC deputy director Caitriona Fitzgerald told The Verge that Congress has recently passed at least one meaningful reform: the Protecting Americans’ Data from Foreign Adversaries Act of 2024, which Fitzgerald calls “the strongest federal privacy law in recent years.” PADFAA prevents data brokers from giving hostile countries access to Americans’ sensitive personal information, and EPIC used it to file a complaint about Google’s real-time bidding ad system — which allegedly broadcasts sensitive data indiscriminately.

Overall, though, it’s fair to say the situation isn’t great.

By early 2026, a sense of learned helplessness about privacy has taken hold in many quarters. Companies like Meta are pushing the line that if existing technology already raises privacy concerns, it’s unreasonable to complain when new technology makes things worse. According to internal documents, Meta also apparently believes that the Trump administration’s highly public erosion of civil liberties (or what Meta euphemistically refers to as a “dynamic political environment”) will keep activists distracted and leave the company free to push invasive features like facial recognition into its products.

But the administration’s actions make the dangers of these systems increasingly difficult to ignore. It’s one thing to know the government could look up personal information about you. It’s another thing entirely to have ICE agents intimidate you by dropping your name.

Not all of today’s privacy nightmares have simple regulatory solutions. But privacy groups have argued for years that there are obvious ways to start improving the situation. A long-standing wish list from a coalition that includes EPIC, PIRG and others proposes the creation of a new independent federal data protection agency, as well as a private right of action that would allow individuals to sue for violations of privacy laws. One of the latest proposals is the Data Justice Act, a model piece of legislation outlined last month by a group of researchers at NYU Law. It aims to limit the state’s collection and use of our deep digital footprints, with the goal of redefining personal data “not as information that the state can freely access, but as something that is inherently ours.”

With many digital technologies, there is probably no turning back the clock – and in many cases, people wouldn’t want to. But it’s time for more lawmakers to take the risks these technologies create seriously and decide they’re worth fighting back against.

  • In many ways, governments around the world are actually moving backward on privacy, thanks to the rise of online age-gating. In the US, the Supreme Court has already approved age verification for sites with a large amount of adult content. Now, several states have passed laws requiring it for basically every app on your phone, a policy the Supreme Court is likely to consider sometime this year.
  • Virtually every issue in tech regulation is intertwined, so technology monopolies also exacerbate privacy problems by limiting competition and concentrating information in a few places where it can be used. (That’s another issue that Congress has addressed but failed to follow through on.) Laws also don’t work unless the government fairly enforces them, so the Trump administration’s era of gangster tech regulation must end.
  • One of the simplest privacy rallying cries of recent years has been to “ban facial recognition” – the demand usually targets use by government and law enforcement, but there’s also a push to limit its deployment by private actors and on smart glasses.