The Illusion of Consent: Rethinking Privacy Online

Think of the last time you visited a new website. You were likely greeted with a pop-up asking you to accept terms and conditions or agree to some privacy policy. The vast majority of us scroll past these bothersome notices, eagerly searching for the box to check so we can proceed. In those quickly forgotten moments, we consent to an exchange that we barely acknowledge. After all, who has the time to read privacy notices? You would be hard-pressed to find someone who has actually read one from start to finish—let alone the dozens of notices one may interact with in a day or week. This routine action completes an exchange mandated by decades of American privacy law designed to protect our individual autonomy and privacy. Companies present users with terms of service, and users have the power and control—in theory—to choose whether to agree to the terms and use the service. This exchange—which is responsible for legitimizing the vast majority of any organization’s data practices—has been characterized among privacy professionals as “notice-and-choice.”[1]

Despite the prevalence of this notice-and-choice framework, it is inherently flawed. The sheer volume and complexity of privacy notices overwhelm users, making genuine informed consent virtually impossible.[2] Instead of fostering transparency and control, these practices often obscure the true nature of the data exchange.[3] As a result, our reliance on this system fails to adequately safeguard personal information in an increasingly data-driven world. Why do we still rely on this system?

The idea that consent can legitimize nearly any personal data exchange runs through more than a century of American privacy law, from early wiretapping statutes and HIPAA to state privacy bills awaiting approval today.[4] Almost all of these laws approach privacy protection through individual control, presuming that individuals can protect their privacy when given control over it.[5] Alternatively, some privacy laws exempt would-be violations where there is individual consent.[6] This tradition allows individuals to contract away their privacy as long as they are given notice.[7] These forms replace the user’s privacy expectations with whatever is outlined within them, even when hastily clicked through.[8]

Our haste in this exchange, and our indifference to it, present a major problem. One hundred percent of Americans rely on notice-and-choice to protect their online privacy, yet most do not bother to read privacy notices.[9] What are we agreeing to? Can we consent to an agreement we do not understand? The notice-and-choice approach to privacy protection leaves advocates wanting and Americans unprotected.

This Article will analyze the notice-and-choice approach: what it is, why it does not work, and how changes to the current approach can fix the problem.

I. What Is Notice-and-Choice?

Notice-and-choice is a foundational concept in American privacy law. It operates on the premise that by providing users with notice of data collection practices and obtaining their consent, companies can legitimize their data practices.[10] The idea is that informed users can make choices that align with their personal privacy preferences.[11] However, this framework assumes that users have the time, interest, and expertise to understand complex privacy policies, which is almost never the case.

When a user clicks “I agree” on a privacy notice, they are agreeing to a lengthy document filled with legal and sometimes technical jargon. These documents outline how the company collects, uses, sells, and stores information about the user.[12] In theory, this knowledge gives users control over their personal information. In reality, few people read these documents, and even fewer understand them.[13] This disconnect between the intention of notice-and-choice and the actual user experience is a major flaw in the system.

Notice-and-choice is deeply embedded in existing American privacy laws, reflecting the country’s long-standing affinity for contractual freedom and the power to contract.[14] For instance, HIPAA mandates that healthcare providers give patients notice about their privacy practices and obtain patients’ consent for sharing medical information.[15] However, in a medical setting, a conversation with a doctor provides better privacy notice than the lengthy click-through form on an app. Similarly, the Family Educational Rights and Privacy Act (FERPA) requires educational institutions to obtain a student or parent’s consent before disclosing personal education information.[16]

American lawmakers favor the notice-and-choice model because it closely resembles a valid contractual exchange.[17] In American law, there is a strong affinity for the power to contract, which is seen as a fundamental aspect of liberty, individual autonomy, and economic freedom.[18] By framing privacy protection as a matter of contract, lawmakers can uphold the principle that individuals should have the right to control their personal information by appealing to ideals of liberty.[19] This approach to protecting privacy, therefore, aligns with the broader American legal tradition of prioritizing individual rights and personal responsibility by way of contract.

But applying a contractual framework to privacy protections has significant limitations. The complexity and length of privacy notices mean that genuine informed consent is rarely achieved.[20] The burden falls on individuals to read and understand these notices, even though most people lack the legal or technical expertise to fully comprehend them.[21] Furthermore, the overwhelming volume of privacy notices that users encounter daily makes it impractical for anyone to thoroughly review each one.[22]

One product of this dynamic is a shocking disparity in bargaining power between the user of an online service and the service provider. After all, even if a user were to read every privacy notice for each online service they use and found some data practices objectionable, what are the odds that they would simply give up and abandon whatever objective brought them to that service in the first place?[23]

For example, how many people would delete their profile from their favorite social media platform after discovering objectionable privacy practices? Would Americans stop shopping at their favorite retail stores if they knew their cart contents were being monitored, their body measurements captured, and their face prints analyzed, all for a “better” shopping experience?[24]

Given these limitations, perhaps individual consumers are not in the best position to protect American privacy rights.

II. How Notice-and-Choice Fails

The promise of notice-and-choice hinges on seemingly straightforward logic: (1) service providers draft clear and comprehensive terms and conditions; (2) they present those terms to users in an understandable format; and (3) users, after reading and comprehending the terms, provide their informed consent.[25] In an ideal world, this process would ensure mutual agreement between well-informed consumers and the services they engage with. In practice, however, the logic fails at each step of its implementation.

A. Complexity and Incomprehensibility of Terms

The first prong of notice-and-choice assumes that terms and conditions are accessible and understandable to the average user. In reality, these documents are often dense with legal and technical jargon, rendering them incomprehensible to the general public.[26] Cornell professor and privacy scholar Helen Nissenbaum refers to this phenomenon as the “transparency paradox,” where increased transparency through detailed disclosures leads to decreased readability and comprehension.[27]

The transparency paradox emerges as companies attempt to increase transparency in their privacy disclosures.[28] To ensure regulatory compliance, reduce liability, and even promote trust, companies publish extensive details about their data collection and usage practices in user-facing privacy notices.[29] However, this level of detail, while theoretically transparent, quickly becomes counterproductive—the more information included, the more complex and overwhelming the document becomes for the average user. Thus, the paradox: increasing transparency decreases digestibility.[30] Instead of achieving clarity, the document’s length and complexity obscure the key information users need to make informed decisions. And those informed decisions, which create informed parties to an agreement, are necessary to complete the valid exchange that notice-and-choice promises.

There, at its very first step, the logic of notice-and-choice fails. In the context of online services, informed consent is impossible because individual users cannot understand what they are agreeing to—and often do not even know that they are agreeing to anything. And if achieving informed consent is impossible, then the entire approach is inoperable.

B. Unrealistic Expectations of User Behavior

Perhaps the broadest issue with notice-and-choice is the burden it places on the user. Even if companies simplified their terms and conditions, users would still need to diligently read and evaluate each document before consenting in order to satisfy the second prong of notice-and-choice. But in a fast-paced digital environment where services are accessed with a click or a tap, expecting users to pause and carefully review legal agreements is unrealistic. Users are typically focused on accessing the desired service quickly and efficiently, often bypassing lengthy terms and conditions in favor of expediency.[31]

Research supports this notion. A 2023 study by NordVPN found that the average privacy policy in the United States consists of 6,938 words.[32] The average person reads around 238 words per minute, meaning it would take almost thirty minutes to read an average privacy policy.[33] Reading the privacy policies of the twenty most popular United States websites would take over nine hours.[34] And “reading the privacy policies of the [ninety-six] websites a person typically visits in a month would take longer than a full workweek—46.6 hours.”[35] The study notes that “[i]f you spent this time working for federal minimum wage, you would earn about $349.50.”[36]
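For a sense of how these figures compound, the arithmetic is simple enough to check directly. The sketch below is a minimal back-of-the-envelope calculation using the study’s own inputs; the variable names are illustrative, and the study’s $349.50 minimum-wage figure reflects its own rounding rather than this calculation:

```python
# Back-of-the-envelope check of the NordVPN study's reading-time figures.
# All inputs are the study's own numbers; the names are illustrative.

AVG_POLICY_WORDS = 6_938   # average U.S. privacy policy length (words)
READING_SPEED_WPM = 238    # average adult reading speed (words per minute)
SITES_PER_MONTH = 96       # websites a person typically visits in a month

minutes_per_policy = AVG_POLICY_WORDS / READING_SPEED_WPM
hours_per_month = minutes_per_policy * SITES_PER_MONTH / 60

print(f"One average policy: ~{minutes_per_policy:.0f} minutes")  # ~29 minutes
print(f"A month of policies: ~{hours_per_month:.1f} hours")      # ~46.6 hours
```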

Moreover, companies design their online interfaces to prioritize ease of use rather than enable thorough consideration of legal documents, with interfaces ranging from manipulative skip-throughs to plainly deceptive traps.[37] Many companies create consent mechanisms, such as “click-to-accept” buttons, to streamline the user experience, encouraging quick acceptance rather than careful review.[38] The default settings on many platforms further exacerbate this issue, typically opting users into data collection practices by default and requiring extra effort to opt out.[39] Additionally, many online services employ what are known as “dark patterns”—deceptive interface designs that manipulate users into providing consent.[40] These can include big, bright buttons for agreeing to terms and conditions contrasted with small, hard-to-find links for opting out, or misleading language that nudges users toward choices that benefit the service provider at the expense of user privacy.[41]

These tactics manipulate user behavior, further undermining the notion of genuine, informed consent.[42] The use of dark patterns in online interfaces has increasingly become a subject of concern for regulators. The Federal Trade Commission (FTC) has targeted these deceptive design practices, as have state regulatory bodies.[43] However, the FTC’s enforcement power in the online space under its Section 5 authority has come under increasing scrutiny, and Americans cannot rely on the FTC’s already stretched resources (and perhaps questionable jurisdiction) to carry the vast burden of protecting online privacy.[44] These agency actions illustrate at least a recognition that notice-and-choice is failing, and that truly informed and voluntary consent is more aspirational than realistic.

Consent to one thing does not imply consent to all things. The principle of informed consent requires that individuals be fully aware of and agree to specific terms and practices.[45] If a company updates its data practices or privacy policy, the update should remain within the scope of what was originally agreed upon. This would ensure that the scope of a company’s data practices remains aligned with the scope of user consent. Thus, the consumer’s burden is heightened further. After all, a company has the “right to change its policy at will, giving due notice of such change, ironically, within the policy itself and therefore requiring interested individuals to read it not once but repeatedly.”[46]

The constant evolution of how companies use customer data challenges the idea that one-time user consent is sufficient.[47] What results from this dynamic is a gap between what users originally consented to and what is actually happening with their data. For example, a company may initially disclose limited data sharing with third parties in its privacy policy. Over time, as the company grows and its business model evolves, it might expand those data-sharing arrangements. If these changes are not promptly and clearly communicated to users, the original consent no longer covers the company’s actual practices. Users remain unaware of new practices that could significantly impact their privacy, effectively nullifying the concept of informed consent.

Moreover, companies’ notifications to users of changes to their privacy policies are often insufficient. Policy update notifications are typically buried in emails or displayed as brief pop-ups that users may ignore or overlook. This lack of effective communication further erodes the concept of informed consent, as users remain completely in the dark about significant changes that impact them.

III. What’s the Solution?

Despite its promising ideals and symbolic rhetoric, the notice-and-choice framework has failed to protect individual privacy. It struggles to keep pace with rapidly evolving threats to privacy and increasingly sophisticated data practices that can glean comprehensive and deeply personal insights about individuals.[48] Amidst these challenges, however, alternative mechanisms of online privacy protection exist, both outside of and in addition to the notice-and-choice framework. Two transformative solutions could effectively fill the holes left by our current approach.

A. Implementing Data Minimization Principles

One promising solution involves bolstering legal protections that complement or even replace notice-and-choice. For instance, state privacy laws that implement data minimization principles could serve as a first line of defense.[49] These principles limit an app or website’s collection of personal data to only what is necessary for the service’s specified purpose, reducing the risk of data breaches and misuse from the outset simply by limiting the amount of personal data collected.[50] By prioritizing data minimization, lawmakers can shift the focus from user consent to proactive risk reduction, aligning privacy laws more closely with technological realities.

The burden also shifts from users to the organizations in control of data. Instead of asking users to do anything, let alone read a lengthy privacy notice, data minimization requires companies to filter the information they collect and process, generally limiting unfettered data practices.[51] For example, the California Consumer Privacy Act (CCPA) requires that companies’ collection and handling of personal information be “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed.”[52] Although California’s prominence in the national economy has pushed most companies operating in the American market toward compliance, citizens outside the state cannot rely on California for the same protection. State privacy laws like the CCPA confer positive privacy rights only on their own residents; an out-of-state resident has no cause of action against violators of a privacy law not enacted by their home state.[53] For that reason, these principles need nationwide adoption.[54]

Effective data minimization principles can reduce faulty reliance on notice-and-choice by providing an effective baseline protection for personal data.[55] These limitations decrease the risk of privacy harm by simply decreasing the points of opportunity. In theory, less data means less risk, both to consumers and to data controllers. Further, where questionable “consent” would suffice under notice-and-choice, data minimization would prohibit unreasonable data practices altogether, thus protecting individuals from unknowingly consenting to unfavorable terms.

B. Adopting a Fiduciary Model of Privacy

Another innovative proposal is to rethink how we conceptualize personal data through a fiduciary model. This approach would treat companies as fiduciaries entrusted with safeguarding user data.[56] Like trustees managing assets on behalf of beneficiaries, companies would owe a duty of loyalty and care to users, requiring them to prioritize user interests over their own financial gain.[57] This model could shift the burden from users’ shoulders to companies’, fostering greater accountability and trust in data practices.[58]

In The Fiduciary Model of Privacy, Yale law professor and privacy scholar Jack M. Balkin explains that the law prescribes fiduciary duties in specific situations.[59] Typically, he says, the law looks to whether the stronger party has invited the weaker party to trust it, with an eye toward the asymmetry of power between them.[60] In these relationships, the law imposes duties of care, confidentiality, and even loyalty toward those who have chosen, from a vulnerable position, to trust a powerful party.[61] For example, doctors stand in a special relationship of stewardship with their patients.[62] We place an immense amount of trust in our doctors, partly because we are acutely aware of what is at risk.[63] Doctors, in turn, accept a heightened responsibility, in exchange for our trust and in recognition of their patients’ more vulnerable position.[64]

Professor Balkin suggests that internet users and massive technology firms are situated in a similarly asymmetrical way.[65] Today, technology companies can collect more information and learn more about users’ private lives than a doctor ever could.[66] Further, these firms have the technology and computing power at their disposal to learn users’ most intimate personal preferences or even to engage in manipulative behavior.[67] Meanwhile, the only protection left to the American consumer is personal choice: the choice not to use online services with objectionable data practices, assuming one is even aware of those practices.

In response to this vulnerability and disparity in bargaining power, imposing fiduciary duties on companies handling personal data could radically change our protection against predatory data practices. By establishing fiduciary responsibilities, the government could legally require technology companies to act in the best interests of their users, removing the burden from the vulnerable consumer and placing it on the data controllers. Whereas companies may now chalk up regulatory fines as “the cost of doing business,” traditional fiduciary duties owed to the user, or data subject, would create exposure to liability well beyond even the largest regulatory fines.

Further, translating traditional fiduciary duties to data collection, processing, and transfer would empower every American to seek damages, actual or statutory, whenever a controller fails to properly safeguard a consumer’s personal information. This shift in responsibility would ensure that companies prioritize user privacy and handle personal data with ethics and diligence.

Conclusion

The notice-and-choice framework has failed to advance the American interest in individual privacy, largely because the informed consent it relies on is nearly impossible in the digital context. That failure leaves individuals vulnerable and unprotected in an increasingly data-driven world. By adopting an alternative approach, we can fundamentally transform the way we think about our rights and our individual privacy. Doing so would create a safer online ecosystem for everyone and a more effective regime of privacy protection, one that promotes equity online by shifting the burden of protection away from consumers and onto the online services that stand to gain the most from the exchange.


  1. Richard Warner & Robert H. Sloan, Beyond Notice and Choice: Privacy, Norms, and Consent, 14 J. High Tech. L. 370, 373 (2013). ↩︎

  2. Daniel Susser, Notice After Notice-and-Consent: Why Privacy Disclosures Are Valuable Even If Consent Frameworks Aren’t, 9 J. Info. Pol’y 148, 154, 156 (2019). ↩︎

  3. Id. at 157. ↩︎

  4. See, e.g., 18 U.S.C. § 2511 (providing that a communication is not unlawfully intercepted under federal wiretap law where at least one party to the communication consents to the interception); 45 C.F.R. § 164.508 (2025) (HIPAA “Privacy Rule” allowing for disclosure pursuant to patient authorization). ↩︎

  5. Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1880, 1880 (2013). The idea that individual control enables privacy protection was first offered in the late nineteenth century in a now famous law review article written by Samuel D. Warren and Louis D. Brandeis (later, Justice Brandeis). Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 200 (1890). ↩︎

  6. See, e.g., Cal. Civ. Code § 1798.135(b)(1) (Deering 2024). ↩︎

  7. Warner & Sloan, supra note 1, at 379–80. ↩︎

  8. Id. at 381. ↩︎

  9. Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar & Erica Turner, Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information 5 (2019), https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2019/11/Pew-Research-Center_PI_2019.11.15_Privacy_FINAL.pdf [https://perma.cc/UTZ2-8PV2]; Colleen McClain, Michelle Faverio, Monica Anderson & Eugenie Park, How Americans View Data Privacy 8 (2023), https://www.pewresearch.org/internet/2023/10/18/how-americans-protect-their-online-data/ [https://perma.cc/LEQ7-4MMY]. ↩︎

  10. Solove, supra note 5. ↩︎

  11. Id. ↩︎

  12. Bex Evans, What to Include in Your US Privacy Notice, OneTrust (June 13, 2023), https://www.onetrust.com/blog/what-to-include-in-your-us-privacy-notice/ [https://perma.cc/3GVP-DD47]. ↩︎

  13. See supra text accompanying note 9; Solove, supra note 5, at 1886. ↩︎

  14. Warner & Sloan, supra note 1, at 381. This is largely a result of the fact that the approach closely resembles a valid contractual exchange. Id. For example, a website provider offers its services to a user. The user’s data (valuable to the website provider for many reasons) can then suffice as consideration in the exchange, as well as the value of the website’s services that the user receives in exchange for their data. Finally, the privacy notice shown to the user then provides more details to give the user a better understanding of the exchange—resulting in mutual assent and an affirmative acceptance (via clicking a box, toggling a slider, etc.) to a valid offer. Id. at 382. ↩︎

  15. 45 C.F.R. §§ 164.508, 164.520 (2025). ↩︎

  16. 20 U.S.C. § 1232g(b). ↩︎

  17. Susser, supra note 2, at 153–54. ↩︎

  18. See, e.g., Lochner v. New York, 198 U.S. 45, 53 (1905). ↩︎

  19. Solove, supra note 5, at 1882–83. ↩︎

  20. Susser, supra note 2, at 155. ↩︎

  21. Id. ↩︎

  22. Id. at 156. ↩︎

  23. Id. at 154. ↩︎

  24. Ban Facial Recognition in Stores, Fight for the Future, https://www.banfacialrecognition.com/stores/ [https://perma.cc/DKR5-PZUL] (listing major retailers and whether they use facial-recognition technology to analyze their customers); Sarina Trangle, Body Scanners, AI Mirrors and Smart Carts: Go Inside the Latest in Shopping Tech, Investopedia (Jan. 14, 2025, 1:09 PM), https://www.investopedia.com/body-scanners-ai-mirrors-smart-carts-inside-the-latest-in-shopping-tech-8773854 [https://perma.cc/4QW3-WXA5]. ↩︎

  25. Warner & Sloan, supra note 1, at 373–74. ↩︎

  26. Helen Nissenbaum, A Contextual Approach to Privacy Online, Daedalus, Fall 2011, at 32, 35. ↩︎

  27. Id. at 36. ↩︎

  28. Id. ↩︎

  29. Evans, supra note 12. ↩︎

  30. Nissenbaum, supra note 26, at 36. ↩︎

  31. McClain et al., supra note 9, at 28. A 2023 Pew Research Center survey found that 69% of U.S. adults “say privacy policies are just something to get past” while only 27% “consider these policies a meaningful part of their decision to use a product or service.” Id. Respondents with more formal education were even more likely to skip past these policies, at a rate of 77%. Id. ↩︎

  32. Irma Šlekytė, NordVPN Study Shows: Nine Hours to Read the Privacy Policies of the 20 Most Visited Websites in the US, NordVPN (Oct. 23, 2023), https://nordvpn.com/blog/privacy-policy-study-us/ [https://perma.cc/F432-TV7S]. ↩︎

  33. Id. ↩︎

  34. Id. ↩︎

  35. Id. ↩︎

  36. Id. ↩︎

  37. Ari Ezra Waldman, Privacy, Notice, and Design, 21 Stan. Tech. L. Rev. 129, 173 (2018). ↩︎

  38. Sara Pegarella, Examples of “Click to Accept”, TermsFeed (May 12, 2024), https://www.termsfeed.com/blog/examples-click-accept/ [https://perma.cc/42E7-T89T]. ↩︎

  39. Sarah Rippy, Opt-in vs. Opt-out Approaches to Personal Information Processing, IAPP (May 10, 2021), https://iapp.org/news/a/opt-in-vs-opt-out-approaches-to-personal-information-processing/ [https://perma.cc/DFZ5-KGLP]. ↩︎

  40. Fed. Trade Comm’n, Bringing Dark Patterns to Light 2 (2022), https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf [https://perma.cc/6B29-X886]. ↩︎

  41. Id. at 23. ↩︎

  42. Id. at 2. ↩︎

  43. Rohit Chopra, Commissioner, Fed. Trade Comm’n, Statement of Commissioner Rohit Chopra Regarding Dark Patterns in the Matter of Age of Learning, Inc. (Sept. 2, 2020), https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf [https://perma.cc/M2EL-FBQH]; Cal. Civ. Code § 1798.140(h) (Deering 2024). ↩︎

  44. Compare Daniel Solove & Woodrow Hartzog, Should the FTC Be Regulating Privacy and Data Security?, TeachPrivacy (Nov. 14, 2014), https://teachprivacy.com/ftc-regulating-privacy-data-security/ [https://perma.cc/58F6-WVHZ] (arguing “that the FTC not only has the authority to regulate data protection to the extent it has been doing, but it also has the authority to expand its reach”), with Alden Abbott, The Federal Government’s Appropriate Role in Internet Privacy Regulation, The Heritage Found. (Oct. 27, 2016), https://www.heritage.org/report/the-federal-governments-appropriate-role-internet-privacy-regulation [https://perma.cc/J67A-6W2H] (arguing that the FTC’s regulation of online privacy exceeds its Section 5 authority). ↩︎

  45. Waldman, supra note 37, at 151. ↩︎

  46. Nissenbaum, supra note 26, at 35. ↩︎

  47. Solove, supra note 5, at 1890. ↩︎

  48. Susser, supra note 2, at 157. ↩︎

  49. As of May 2024, thirteen of the seventeen enacted U.S. state privacy laws require some level of data collection limitations on the part of the service provider or data controller. Jordan Francis, Unpacking the Shift Toward Substantive Data Minimization Rules in Proposed Legislation, IAPP (May 22, 2024), https://iapp.org/news/a/unpacking-the-shift-towards-substantive-data-minimization-rules-in-proposed-legislation [https://perma.cc/NU3G-4TXE]. ↩︎

  50. See, e.g., Cal. Civ. Code § 1798.100(c) (Deering 2024) (requiring business use of personal information be necessary, proportionate, and compatible with disclosed purposes). Though privacy harms can often be speculative, data breaches can present risks to individuals ranging from identity theft to the targeting of protected classes and the enabling of hate crimes. See also Kevin Collier, 23andMe User Data Targeting Ashkenazi Jews Leaked Online, NBC News (Oct. 7, 2023, 10:46 AM), https://www.nbcnews.com/news/us-news/23andme-user-data-targeting-ashkenazi-jews-leaked-online-rcna119324 [https://perma.cc/EPJ3-QX63]. ↩︎

  51. Mohammed Khan, Data Minimization—A Practical Approach, ISACA (Mar. 29, 2021), https://www.isaca.org/resources/news-and-trends/industry-news/2021/data-minimization-a-practical-approach [https://perma.cc/KL7S-VQ22]. ↩︎

  52. Cal. Civ. Code § 1798.100(c) (Deering 2024) (emphasis added). ↩︎

  53. Cal. Civ. Code § 1798.140(i) (Deering 2024) (defining “consumer” as a California resident). ↩︎

  54. Yet, many privacy advocates oppose an omnibus federal privacy law, citing concerns that whatever compromise may be arrived at will likely produce a less strict approach than that of state laws like California’s CCPA, thus weakening already existing frameworks. See Jayne Ponder, Senate Discusses a Federal Privacy Law with Privacy Experts: Examining Lessons From the European Union’s General Data Protection Regulation and the California Consumer Privacy Act, Nat’l L. Rev. (Oct. 17, 2018), https://natlawreview.com/article/senate-discusses-federal-privacy-law-privacy-experts-examining-lessons-european [https://perma.cc/TN9S-67ZV] (reporting how cybersecurity experts pushed the U.S. Senate to make federal protections a “‘floor,’ not a ceiling.”). ↩︎

  55. Avi Gesser, Matthew Kelly, Will Schildknecht, Dr. Vera Jungkind & Dr. Carolin Raspé, A 14.5 Million Euro Fine for Failing to Get Rid of Old Files—Data Minimization Is Becoming a Stand-Alone Cybersecurity Obligation, Program on Corp. Compliance and Enf’t (Dec. 6, 2019), https://wp.nyu.edu/compliance_enforcement/2019/12/06/a-14-5-million-euro-fine-for-failing-to-get-rid-of-old-files-data-minimization-is-becoming-a-stand-alone-cybersecurity-obligation/ [https://perma.cc/5598-ZALG]. ↩︎

  56. Jack M. Balkin, The Fiduciary Model of Privacy, 133 Harv. L. Rev. F. 11, 13–14 (2020). ↩︎

  57. Id. at 14. ↩︎

  58. Id. at 16. ↩︎

  59. Id. at 13–15. ↩︎

  60. Id. at 13–14. ↩︎

  61. Id. at 14. ↩︎

  62. Balkin, supra note 56, at 15. ↩︎

  63. Id. ↩︎

  64. Id. ↩︎

  65. Id. at 15–16. ↩︎

  66. Id. at 15. ↩︎

  67. Id. at 15; see also Aaron Nathans, Episode 1: How Consumer Tech Can Manipulate You (and Take Your Data), Princeton Eng’g: Cookies Tech Sec. & Priv. (Sept. 15, 2020), https://engineering.princeton.edu/news/2020/09/15/episode-1-how-consumer-tech-can-manipulate-you-and-take-your-data [https://perma.cc/LJ3Y-3PGW] (episode with Arvind Narayanan) (discussing intrusive practices like real-time tracking via geolocation within retail stores, cross-platform and cross-device tracking that creates scarily accurate profiles of us, and both the incentives and techniques of businesses to actually shape consumers’ interests and desires instead of merely reacting to those interests). ↩︎