The State of Consumer Data Privacy Laws in the US (And Why It Matters)

Posted on November 3rd, 2022

With more of the things people buy being internet-connected, more of our reviews and recommendations at Wirecutter include lengthy sections detailing the privacy and security features of such products, from smart thermostats to fitness trackers. As the data these devices collect is sold and shared—and hacked—deciding what risks you’re comfortable with is a necessary part of making an informed choice. And those risks vary widely, in part because there’s no single, comprehensive federal law regulating how most companies collect, store, or share customer data.

Most of the data economy underpinning common products and services is invisible to shoppers. As your data gets passed around between countless third parties, there aren’t just more companies profiting from your data, but also more possibilities for your data to be leaked or breached in a way that causes real harm. In just the past year, we’ve seen a news outlet use pseudonymous app data, allegedly leaked from an advertiser associated with the dating app Grindr, to out a priest. We’ve read about the US government buying location data from a prayer app. Researchers have found opioid-addiction treatment apps sharing sensitive data. And T-Mobile recently suffered a data breach that affected at least 40 million people, some of whom had never even had a T-Mobile account.

“We have these companies that are amassing just gigantic amounts of data about each and every one of us, all day, every day,” said Kate Ruane, senior legislative counsel for the First Amendment and consumer privacy at the American Civil Liberties Union. Ruane also pointed out how data ends up being used in surprising ways—intentionally or not—such as in targeting ads or adjusting interest rates based on race. “Your data is being taken and it is being used in ways that are harmful.”

Consumer data privacy laws can give individuals rights to control their data, but if poorly implemented such laws could also maintain the status quo. “We can stop it,” Ruane continued. “We can create a better internet, a better world, that is more privacy protective.”

What current national privacy laws (don’t) do

Currently, privacy laws are a cluttered mess of different sectoral rules. “Historically, in the US we have a bunch of disparate federal [and state] laws,” said Amie Stepanovich, executive director at the Silicon Flatirons Center at Colorado Law. “[These] either look at specific types of data, like credit data or health information,” Stepanovich said, “or look at specific populations like children, and regulate within those realms.”

The data collected by the vast majority of products people use every day isn’t regulated. Since no federal privacy law covers most companies, they’re pretty much free to do what they want with the data, unless a state has its own data privacy law (more on that below).

In most states, companies can use, share, or sell any data they collect about you without notifying you that they’re doing so.

No national law standardizes when (or if) a company must notify you if your data is breached or exposed to unauthorized parties.

If a company shares your data, including sensitive information such as your health or location, with third parties (like data brokers), those third parties can further sell it or share it without notifying you.

“Most people believe they’re protected, until they’re not,” said Ashkan Soltani, an independent researcher and former chief technologist at the Federal Trade Commission. “Sadly, because this ecosystem is primarily hidden from view and not transparent, consumers aren’t able to see and understand the flow of information.”

Europe’s comprehensive privacy law, the General Data Protection Regulation (GDPR), requires companies to ask permission to share certain data and gives individuals rights to access, delete, or control the use of that data. The United States, in contrast, doesn’t have a singular law that covers the privacy of all types of data. Instead, it has a mix of laws that go by acronyms like HIPAA, FCRA, FERPA, GLBA, ECPA, COPPA, and VPPA, designed to target only specific types of data in special (often outdated) circumstances.

The Health Insurance Portability and Accountability Act (HIPAA) covers far less than most people assume: it applies only to health data handled by “covered entities,” which include doctors, hospitals, pharmacies, insurers, and other similar businesses. People tend to think HIPAA covers all health data, but it doesn’t. Your Fitbit data isn’t protected, for example, nor does the law restrict who can ask for your COVID-19 vaccination status.

The Fair Credit Reporting Act (FCRA) covers information in your credit report. It limits who is allowed to see a credit report, what the credit bureaus can collect, and how information is obtained.

The Family Educational Rights and Privacy Act (FERPA) details who can request student education records. This includes giving parents, eligible students, and other schools the right to inspect education records maintained by a school.

The Gramm-Leach-Bliley Act (GLBA) requires consumer financial services, such as loan servicers or investment advisers, to explain how they share data and to inform customers of their right to opt out. The law doesn’t restrict how companies use the data they collect, as long as they disclose such usage beforehand. It does at least attempt to put guardrails on the security of some personal data.

The Electronic Communications Privacy Act (ECPA) restricts government wiretaps on telephone calls and other electronic signals (though the USA Patriot Act redefined much of this). It also sets broad rules concerning how employers can monitor employee communications. Critics often point out that ECPA, which was passed in 1986, is outdated. Since ECPA was written well before the modern internet, it doesn’t protect against modern surveillance tactics such as law enforcement access to older data stored on servers, in cloud storage documents, and in search queries.

The Children’s Online Privacy Protection Act (COPPA) imposes certain limits on a company’s collection of data from children under 13 years old.

The Video Privacy Protection Act (VPPA) prevents the disclosure of video-rental records. The law might sound quaint now, but it came about after a journalist pulled the video-rental history of Supreme Court nominee Robert Bork. VPPA hasn’t held up well against streaming companies, though.

The Federal Trade Commission Act (FTC Act) empowers the FTC to go after an app or website that violates its own privacy policy. The FTC can also investigate deceptive marketing claims about privacy, as it did when it issued a complaint against Zoom for telling users that video chats were end-to-end encrypted when they weren’t. Some groups have recently called on the FTC to expand that power to cover abusive data practices.

With such a wide range of laws, it’s easy to see how people get confused about what rights they do and don’t have. On top of these federal laws sits a patchwork of state laws.

Currently, three US states have comprehensive consumer privacy laws: California (CCPA and its amendment, CPRA), Virginia (VCDPA), and Colorado (ColoPA). Regardless of where a company is located, the rights these laws provide apply only to residents of those states.

“A lot of the provisions are business-model affirming. [VCDPA] essentially allows big data-gathering companies to continue doing what they have been doing.” —Kate Ruane, senior legislative counsel, American Civil Liberties Union

These laws have similar provisions that tend to give you some type of notice and choice in controlling your data. Essentially, a company operating under these regulations must tell you if it’s selling your data; you also get a choice in whether you’re okay with that or not, and you have the right to access, delete, correct, or move your data. These laws differ slightly in other ways, such as in the allowed cure periods (the amount of time a company has to correct a mistake), the size or income level of businesses the law applies to, and whether you can use tools or “authorized agents” for opt-out requests (such as a setting in your web browser that automatically opts you out of data sales on a web page, or a service where another person makes opt-out requests for you).

The experts we spoke to referred to California’s privacy protections as the strongest in the US, since the regulations include a limited “private right of action”—the ability to sue a company—against certain types of data breaches. California also requires a “global opt out” to remove oneself from data sharing by device or browser, instead of being forced to opt out on each site individually. In contrast, some of the experts we spoke with viewed Virginia’s Consumer Data Protection Act with skepticism. “I would consider [VCDPA] a pretty weak bill,” said Ruane at the ACLU. “It is based on opt-out consent. There are no civil-rights protections. There is no private right of action. A lot of the provisions are business-model affirming. It essentially allows big data-gathering companies to continue doing what they have been doing.” None of that should be too surprising considering that Virginia’s law was written with strong input from Amazon.

At least four other states (Massachusetts, New York, North Carolina, and Pennsylvania) have serious comprehensive consumer data privacy proposals in committee right now. Other states have varying laws in the early stages. It can be difficult to follow the status of all these proposals, but the International Association of Privacy Professionals has a tracker that shows which states have privacy legislation in progress and where those bills are in the process. According to research from The Markup, at least 14 of the proposals are similar to Virginia’s weaker law.

As with the national laws, there are state-level laws that carve out coverage of individual aspects of data privacy. Missouri has ebook privacy rules. The Illinois Biometric Information Privacy Act (BIPA) gives people privacy rights over their biometric data, such as their fingerprint or face scans. When it comes to data-breach notifications, it’s particularly hard to know your rights, with at least 54 different laws that vary by region.

Amie Stepanovich of the Silicon Flatirons Center noted that such state laws are still useful, even if they can get confusing. “You can think of them as raising the water level,” she said, adding that companies often choose “to apply the stronger, more protective standard across the board for everyone” when legal standards go up.

There’s also a risk of too many state laws generating confusion, both operationally for companies and practically for consumers. Whitney Merrill, a privacy attorney and data protection officer, said that a federal law would make matters easier for everyone. “We need a federal law that thinks about things in a much more consistent approach,” Merrill said, “to make sure that consumers understand and have the right expectation over rights that they have in their data.”

Four areas that deserve basic protections, according to privacy experts

Everyone we spoke with described potential consumer data privacy laws as the “floor,” where it would be possible to build upon them in the future as new technologies spring up. This floor typically encompasses a few basic protections:

Data collection and sharing rights: Laws should give people the right to see what data various companies have collected on them, to request that companies delete any data they’ve collected, and to take data easily from one service to another. This also includes the right to tell companies not to sell (or share) your data to third parties. To get an idea of how this kind of regulation works in practice, we looked at what it’s like to request information in California under the CCPA, which tends to require that you click through at least one form on every single website you interact with (and for some third parties you may not even know exist).

Opt-in consent: A company should have to ask you if it may share or sell your data to third parties. You shouldn’t have to spend hours opting out of the collection of your private data through every service you use.

Data minimization: A company should collect only what it needs to provide the service you’re using.

Nondiscrimination and no data-use discrimination: A company shouldn’t discriminate against people who exercise their privacy rights; for example, the company can’t charge someone more for protecting their privacy, and the company can’t offer discounts to customers in return for their giving up more data. This regulation should also include clarification about civil-rights protections, such as preventing advertisers from discriminating against certain characteristics.

Merrill would also like to see a more comprehensive data-breach notification law, perhaps as a standalone bill. “I think that’d be a pretty easy thing to pass,” she said. “Who gets notified? What are the common standards? Let’s make it easy so everyone is on the same page.”

“Especially in those states where they don’t allow a private right [to sue], to then also underfund the public enforcement—it’s just an insult to injury.” —Hayley Tsukayama, legislative activist, Electronic Frontier Foundation

No regulation means much without an enforcement mechanism. And lobbyists have fought against a “private right of action”—letting an individual sue a company over privacy violations—as one such mechanism. California’s law has a limited private right of action related to negligence with regard to a data breach. The Colorado and Virginia laws don’t even have that. Several bills, including those in Connecticut, Florida, Oklahoma, and Washington, failed to become law because they included a private right of action. In early 2021, lawmakers in North Dakota introduced a bill that included a private right of action and opt-in consent, and in response a group of advertising companies (PDF) claimed: “Such an approach would create the most restrictive privacy law in the United States.” The bill failed in the state house.

Hayley Tsukayama, a legislative activist at the Electronic Frontier Foundation, described the situation bluntly. “We would like to see full private rights of action in privacy legislation,” she said. “We just think if a company violates your privacy, you should be able to sue them.”

“Historically, marginalized communities have not been able to rely on public institutions to vindicate their rights,” Stepanovich said. “So having something like a private right of action for Black communities and for other communities that are not white ensures that they can enforce their own rights or go to court when something has gone wrong.”

Soltani, in contrast, saw a way forward without the private right of action: “I think enforcement is a really important facet. If there’s adequate enforcement—legal protections and regulatory resources—I don’t think it’s a dealbreaker to forgo a private right to action.”

Those resources are important. “Especially in those states where they don’t allow a private right [of action], to then also underfund the public enforcement—it’s just an insult to injury,” Tsukayama said. For enforcement, California created a dedicated agency, the California Privacy Protection Agency, which will receive $10 million in annual funding. In Virginia, the state attorney general’s office handles enforcement with $400,000 in funding, supplemented by fines and penalties.

Throwing money at enforcement or requiring companies to adapt to new rules also requires people to do the work, and those people aren’t always readily available. “One of my concerns with state laws is that it’s more and more stuff to learn,” Merrill noted, “and I’m afraid of burnout in the privacy community because it’s impossible to keep up, and the stakes are so high.”

The Internet Association, an industry group that represents several big tech companies, including Amazon, Facebook, and Google, pointed us to a letter and testimony sent to the New Jersey legislature that focuses on two points: consent and private right of action. The association is pushing for the current opt-out consent model to maintain the status quo, in which consumers have to go out of their way to get the privacy protections outlined in the law. The association also included a paper from the Institute for Legal Reform, an affiliate of the US Chamber of Commerce that advocates for business-friendly legal reforms, which claims that private lawsuits would hinder innovation, cost too much money, and lead to inconsistent rulings.

How stronger privacy laws would change your day-to-day experience

If you’ve ever clicked through one of those annoying “cookie” notifications or been forced to scroll to the end of a privacy policy before you can use software, you’ve had a glimpse of how such laws can have a detrimental effect on your day-to-day experience.

It doesn’t have to be this way. Stepanovich said that if a privacy law is well written, most people’s lives shouldn’t change. “Privacy isn’t about not using tech, it’s about being able to participate in society and knowing your data isn’t going to be abused, or you’re not going to have some harm down the road because of it,” she said. Done right, such laws could minimize the sorts of consequences seen in scandals like those surrounding Cambridge Analytica or Grindr. And you’d see fewer personalized ads and more contextual ones, which are arguably less creepy anyway.

A well-written data privacy law would make it easier for you to buy many of the products you’re curious about without needing to worry about the privacy concerns of doing so. Perhaps Wirecutter reviews and guides wouldn’t need in-depth comparisons assessing the privacy policies for running watches, smart scales, or robot vacuums, because they’d all have a baseline of privacy, as well as clear, easy-to-understand opt-in rules for sharing data. And if a company messes up and abuses those privacy rights, that company would be held accountable for a change.

One sticking point of the current opt-out system is notification fatigue. When every app and website is asking you for dozens of permissions, it becomes easier to accept the status quo than to manually opt out of every tracking technology. A review article in Science (PDF) in 2015 highlighted just how poorly most people performed in navigating privacy risks, and a 2019 paper described the sort of “notice and choice” consent that everyone is used to as “a method of privacy regulation which promises transparency and agency but delivers neither.”

All of the experts we spoke with preferred an opt-in consent model and “privacy by default” concepts. Under such an arrangement, accounts would start out private and apps would start with no permissions; it would be up to you to opt into anything more. Alongside the right to sue companies, opt-in consent is proving to be one of the hardest provisions to get into privacy laws. In its place, experts are pushing for the ability to use browser extensions or other tools that opt out automatically.

Soltani has championed one such technical solution, Global Privacy Control (GPC), which provides a way to opt out of the sale of data at the browser or device level—an improvement over the need to opt out at every site or on every service. GPC is currently included in a handful of browsers and is respected by several publications, including The New York Times. California will more explicitly require businesses to honor GPC once its “global opt out” rules go into effect in 2023.
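Under the hood, GPC is a simple signal: participating browsers attach a `Sec-GPC: 1` header to each web request (and expose the preference to scripts via `navigator.globalPrivacyControl`). Here is a minimal sketch of how a site’s backend might honor that signal; the function name and the plain header dictionary are illustrative assumptions, not taken from any particular web framework:

```python
# Sketch: honoring the Global Privacy Control signal on the server side.
# Browsers with GPC enabled send the request header "Sec-GPC: 1", which the
# GPC proposal defines as a do-not-sell-or-share preference. The function
# below is hypothetical; real frameworks expose headers through their own
# request objects.

def data_sale_allowed(request_headers: dict) -> bool:
    """Return False when the visitor's browser signals a GPC opt-out."""
    # HTTP header names are case-insensitive, so normalize keys first.
    headers = {name.lower(): value for name, value in request_headers.items()}
    # A value of "1" means the user has opted out of data sale/sharing.
    return headers.get("sec-gpc") != "1"

# A browser with GPC enabled:
print(data_sale_allowed({"Sec-GPC": "1"}))  # → False
# A browser without the signal:
print(data_sale_allowed({}))                # → True
```

The point of the design is that the opt-out travels with every request automatically, so the user states a preference once, in the browser, rather than once per website.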

The impact of these types of laws could even reverse some of the “privacy is dead” despair that many people feel, as Amie Stepanovich noted. “You want that hopelessness to go away and for people to know: You are being protected while you’re doing this activity.”

The basic privacy laws being advocated for, proposed, and sometimes passed can’t and won’t fix everything. Given the complexity of the data economy that now exists, there’s plenty more that could and arguably should be done. Even the latest laws leave out all sorts of other data concerns, such as algorithm transparency or government use of facial recognition. Several national privacy bills are in various stages of the legislative process, but none has a serious chance of passing anytime soon.

But new laws could at least encourage less privacy-hostile products and services, and they could provide basic protections (and enforcement) against the most harmful types of data mining, as well as form a baseline for more privacy protections in the future. At its best, a data privacy law could make it so that you can buy the latest gizmos with fun new features without having to fret over the fact that the company is collecting more data than you realize and selling it to companies you’ve never heard of to be used by advertisers to market to you.

Source: NY Times
