Orange is the New Black, and Data is the New Oil 

In the modern economy, data has supplanted oil as the most valuable commodity. But unlike oil harvested from the earth, data is harvested from us—every search, scroll, and swipe becomes fuel for a society that is sprinting at breakneck speed to integrate AI at every possible point of human contact. Economies are no longer just goods-based or service-based; they are now algorithm-based. For tech companies, more data means better AI, more personalized services, and ultimately, greater profits. But for individuals, this means increased surveillance and frequently opaque repurposing of our most personal information.

Social media platforms like TikTok, Instagram, and Facebook track nearly every user interaction. Algorithms optimize feeds by learning from what we like, how long we pause, what we ignore, and even where our eyes are drawn on our screens. This data teaches AI systems to predict and influence our behavior with startling precision. While convenient, this personalization relies on collecting vast amounts of data—often without users’ informed consent or awareness of how their data will be reused [1]. The more data we offer up, the more refined these predictive models become. However, that same precision comes with ethical baggage. What begins as user engagement can quickly slip into manipulation.

Beyond social media, consumer apps and smart devices contribute to the data economy. A weather app might sell precise location data to advertisers [2]; a wearable might share your sleep or heart rate patterns with analytics firms. Smart TVs, speakers, thermostats, refrigerators, and home security cameras can collect and share information on viewing habits, voice commands, temperature preferences, food consumption, and what times of day you are likely to be home. These practices raise serious questions about consent and control over the data these connected devices generate. The everyday user rarely understands the full scope of how their data is gathered, stored, shared, or monetized. The Cambridge Analytica scandal, in which millions of Facebook users’ data was harvested for political profiling without consent, and Clearview AI’s unauthorized scraping of facial images from social media, are cautionary tales of how easily data can be weaponized [3][4].

Legal frameworks are evolving in response. The European Union’s General Data Protection Regulation (“GDPR”) set a global benchmark by enshrining rights like access, deletion, and data portability, backed by fines of up to 4% of annual global revenue [5]. GDPR also requires that companies have a clear legal basis for processing personal data. It further mandates privacy by design: a proactive approach to data protection that integrates privacy considerations and data safeguards into the design and architecture of systems and processes from the outset of their development. GDPR’s extraterritorial scope means that even U.S. companies handling European data must comply. In the U.S., the California Consumer Privacy Act (CCPA), as amended and expanded by the CPRA, has given Californians new powers to opt out of data sales, access collected data, and demand deletion. Other states are following suit, creating a growing patchwork of privacy laws [6]. Colorado, Virginia, Connecticut, Utah, and several others have passed their own statutes, leading to varying levels of consumer rights and business obligations across jurisdictions.

Yet the absence of a comprehensive federal law in the U.S. creates confusion for both businesses and consumers. Proposed legislation like the American Data Privacy and Protection Act (ADPPA) shows promise but has yet to gain traction. Meanwhile, consumers are often left with few practical means of protecting their data, and businesses face uncertainty in compliance strategies. Most recently, the Consumer Financial Protection Bureau (“CFPB”) quietly killed a rule proposed under the Biden Administration in December 2024 that was designed to limit the ability of U.S. data brokers to sell sensitive information about Americans. The CFPB published a notice in the Federal Register on May 14, 2025 stating that the rule was no longer “necessary or appropriate” [7].

So, what can be done? First, businesses must adopt data minimization principles—don’t drill for all the oil you can, only the oil you need to operate. Too often, companies collect superfluous data “just in case” it becomes useful, a practice that increases both liability and user risk; others collect data for the explicit purpose of selling it to brokers on secondary markets. Second, ensure transparency: privacy policies should clearly articulate what data is collected, why, and how it will be used. Boilerplate disclosures buried in long-winded legalese are not enough. Third, provide users with meaningful control, including easy-to-use tools for opting out, deleting, or correcting their data. These mechanisms should be designed with user experience in mind, not as legal box-checking exercises.

Security is another vital component. Collected data must be adequately protected against breaches, leaks, or insider threats. Encryption, access control, and regular security audits should be standard. Companies should also limit data retention to what is necessary and define deletion timelines. Data that doesn’t exist can’t be misused or stolen. 
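To make the retention point concrete, the following sketch shows what an automated deletion timeline can look like in code. It is a minimal illustration only: the Python language choice, the user_events table, the created_at column, and the 90-day window are all assumptions for demonstration, not a recommendation for any particular system, and a real deployment would also need to account for audit trails, backups, and legal holds.

    # Minimal sketch of a scheduled retention sweep (hypothetical table and field names).
    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION_DAYS = 90  # illustrative retention window, not a legal standard

    def purge_expired_records(db_path: str) -> int:
        """Delete records older than the retention window; return rows removed."""
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        with sqlite3.connect(db_path) as conn:
            cur = conn.execute(
                "DELETE FROM user_events WHERE created_at < ?",
                (cutoff.isoformat(),),
            )
            conn.commit()
            return cur.rowcount

    if __name__ == "__main__":
        deleted = purge_expired_records("app_data.db")
        print(f"Purged {deleted} expired records")

Run on a schedule, a job like this enforces the principle that data past its useful life should not linger simply because storage is cheap.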

And finally, legal professionals are essential in this ecosystem. We not only help companies comply with laws but also shape how businesses think about privacy and ethics. From conducting data impact assessments to advising on product design, lawyers serve as architects of responsible data governance. The role of in-house counsel, privacy officers, and compliance advisors is no longer a back-office function—it is central to strategic risk management. Moreover, attorneys advising consumers can help individuals navigate their rights, respond to breaches, and hold organizations accountable when they overstep. 

As the AI economy expands, so too must our commitment to privacy as a civil right. Legal frameworks need to be harmonized and enforced. Policymakers must listen to both technical experts and civil society advocates when shaping laws that govern data use. And companies must internalize privacy not just as a compliance issue, but as a pillar of trust and brand reputation. 

With strong legal frameworks, sound business practices, and active legal counsel, data can drive advancements without compromising human dignity. But if we fail to act, we risk becoming nothing more than fuel for the next algorithm—our lives distilled and refined into digital profiles, with little say in how we’re used. In this cutting-edge world, privacy must not become a relic. It must remain a right. 

Citations: 

[1] World Economic Forum, “Why AI design must prioritize data privacy,” 2023. 

[2] The Verge, “The Weather Channel app unlawfully obtained user location data, says prosecutor,” 2019. 

[3] The Guardian, “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach,” 2018. 

[4] Reuters, “Clearview AI fined by Dutch agency for facial recognition database,” 2024. 

[5] GDPR.eu, “What is GDPR, the EU’s new data protection law?” 2023. 

[6] IAPP, “US State Privacy Legislation Tracker,” 2025.

[7] Wired, “CFPB Quietly Kills Rule to Shield Americans From Data Brokers,” 2025.