The Contract Conundrum: Avoiding Data Privacy and Data Protection Risks in Agreements

“Documenting how organizations handle personal data should reflect actual operations, not aspirations.”

Debbie Reynolds, “The Data Diva”

For over two decades, I have worked at the intersection of law, technology, and privacy, long before data protection and data privacy became mainstream corporate concerns. Across industries, regions, and regulatory frameworks, I have consistently observed that many organizations still underestimate the extent to which contracts contribute to their overall data privacy risk. As global attention shifts toward emerging technologies such as artificial intelligence, biometric surveillance, and novel uses of consumer data, organizations that continue to treat contracts as routine or static instruments, especially when those contracts govern the handling of personal data, are setting themselves up for regulatory heartburn and reputational damage.

This “Contract Conundrum” is critical because of the growing misalignment between traditional contract management norms and evolving global expectations for data privacy and protection. Standard contracts often fail to consider or address the nuances of personal data handling, particularly regarding data subject rights, transparency, consent, sensitive data, and cross-border data transfers. The failure to modernize contractual language and ethos around these pillars is not just an operational oversight. It is a data privacy risk for organizations.

Here are four persistent Data Privacy and Data Protection “contract conundrum” issues that heighten organizational exposure, each illustrated with real-world examples of what can go wrong when data privacy obligations are poorly integrated into contracts.

1. Being Too Vague About How Your Organization Protects Data

Many contracts governing data sharing, especially business-to-business agreements, still contain generalized language such as “reasonable safeguards” or “industry-standard protections” without clear, actionable details. While such language may have been sufficient a decade ago, it no longer meets the standards of modern data protection laws such as the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), or the Illinois Biometric Information Privacy Act (BIPA).

Today’s regulators, courts, and consumers expect organizations to define how data is collected, processed, stored, and protected with specificity. This includes identifying encryption standards, data retention timelines (or the criteria for determining them), access control mechanisms, and whether data is shared with subprocessors. Contracts that fail to reflect these specifics may expose the organization to non-compliance and undermine the enforceability of privacy-related or data protection-related obligations.

One frequent example of vagueness is in the data retention language of contracts. Under the GDPR, it is no longer acceptable simply to state that data will be retained “only as long as necessary.” Article 13(2)(a) and Article 14(2)(a) of the GDPR require organizations either to specify how long data will be retained or, if that is not possible, to clearly describe the criteria used to determine the retention period. A generic statement without supporting detail fails to meet this standard. For instance, if a company retains data for varying durations depending on the type or purpose of processing, it should explain those differences and the underlying rationale, as sketched below. Providing no explanation or relying on vague language creates legal risk and undermines transparency obligations.
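To make this concrete, teams can document retention rules per data category in a form that both contract language and operations can reference. The following is a minimal illustrative sketch in Python; the categories, periods, and rationales are hypothetical placeholders, not legal guidance.

from datetime import date, timedelta

# Illustrative retention schedule: each data category maps to a retention
# period and the rationale behind it (the "criteria" that GDPR Articles
# 13(2)(a) and 14(2)(a) expect organizations to articulate). All values
# here are hypothetical examples, not recommendations.
RETENTION_SCHEDULE = {
    "billing_records": {"days": 7 * 365, "rationale": "statutory tax retention obligation"},
    "support_tickets": {"days": 2 * 365, "rationale": "contract performance and dispute window"},
    "marketing_contacts": {"days": 365, "rationale": "consent-based; reviewed annually"},
}

def past_retention(category: str, collected_on: date, today: date) -> bool:
    """Return True if a record in this category has exceeded its retention period."""
    rule = RETENTION_SCHEDULE[category]
    return today > collected_on + timedelta(days=rule["days"])

# Example: a billing record collected in early 2015 exceeds a roughly
# seven-year retention period by 2025.
print(past_retention("billing_records", date(2015, 1, 1), date(2025, 1, 1)))  # True

The point of a schedule like this is that the contract's retention clause can then state the same category-by-category periods and criteria, so the documented commitment matches actual operations.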

Consider the increasing prevalence of artificial intelligence in everyday business. If an organization uses AI models trained on consumer data, yet the contract fails to specify this use or outline associated data protection measures, the organization risks violating disclosure and purpose limitation principles. More importantly, vague language cannot shield an organization from public scrutiny or class-action lawsuits when that data is mishandled.

2. Confusion About When Consent Is Necessary for Using Sensitive Data

A common and consequential mistake organizations make is confusing notice with consent, particularly regarding sensitive data. This confusion is especially prevalent in the U.S., where traditional data use rules generally required only notice for certain uses. That is no longer the case: many U.S. states have enacted privacy legislation that makes notice alone insufficient.

Simply informing a user about the data you collect and how you use it is not the same as obtaining their explicit consent, especially in jurisdictions with strict biometric or health data laws.

The recent lawsuit involving Photobucket is a prime example. Filed in the U.S. District Court for the District of Colorado in 2024, Pierce et al. v. Photobucket, Inc. et al., Case No. 1:24-cv-03432, alleges that Photobucket unlawfully licensed user-uploaded images, many of which contained facial and iris scans, to third-party AI companies without the users’ explicit consent. The plaintiffs argue that these actions violated biometric privacy laws in Illinois and California, breached Photobucket’s contractual terms, and constituted unfair consumer practices.

The case is further complicated by Photobucket’s alleged coercion of inactive users to accept new terms of service that retroactively allowed their images to be used for AI training. Consent obtained under duress or without adequate understanding does not satisfy the legal standard for handling sensitive data such as biometric identifiers. The fact that approximately 13 billion images were made accessible to third parties amplifies the potential harm, especially in an era when deepfake technology can exploit biometric data for malicious purposes.

What is essential here is understanding that notice alone is insufficient for data categories deemed sensitive, such as biometrics, health data, and sexual orientation. Organizations must obtain informed consent from data subjects before processing or sharing their sensitive data. Contracts that fail to include mechanisms for this type of consent expose the business to legal liability, as evidenced by this case.

3. Not Fully Understanding How Notice Can Differ When Dealing with Emerging Technology

Notice is not a static requirement. Organizations also need to consider the emerging technologies they are using. Too often, contracts include outdated or insufficient language around data use, particularly in scenarios involving AI surveillance, biometrics, facial recognition, or automated decision-making. Organizations deploying these technologies must ensure that the notice they provide to individuals clearly states not just that data is being collected, but how and why it is being used, especially if it is being analyzed by or matched against AI systems.

The U.S. Federal Trade Commission’s recent action against Rite Aid serves as a cautionary tale. In late 2023, the FTC banned Rite Aid from using facial recognition surveillance technology for five years, citing reckless and discriminatory deployment. According to public reporting, the company used AI tools to identify shoplifters based on facial recognition data, often relying on outdated, racially biased, or misleading photos. Consumers and employees were not given clear notice that facial recognition was being used, nor were they informed of their rights to opt out or challenge the technology’s decisions.

This case highlights a fundamental misunderstanding. Placing a general “surveillance in use” sign in a store does not constitute adequate notice when facial recognition technology is involved. Facial recognition is not merely observation, like a static video camera. Biometric processing triggers heightened legal requirements. Contracts governing surveillance technology must account for this by including robust provisions for notice, oversight, and accountability. The absence of these provisions can result in federal enforcement, reputational damage, and operational restrictions, as seen in the Rite Aid case.

4. Improperly Modifying EU Standard Contractual Clauses (SCCs)

Another persistent issue I have encountered in contracts over many years, particularly for multinational organizations, is the improper handling of EU Standard Contractual Clauses (SCCs). These clauses, issued by the European Commission, serve as a foundational legal mechanism for transferring personal data from the European Economic Area to countries that do not have an adequacy decision.

My work with SCCs began shortly after the adoption of Directive 95/46/EC, the predecessor to the GDPR. Even in those early days, I observed how organizations misinterpreted the function of SCCs and frequently attempted to treat them as traditional, negotiable contract terms. The reality, then and now, is that SCCs are binding legal templates that must be adopted without substantive alteration.

The clauses are structured so that only certain fields and annexes can be completed or options selected by the contracting parties. These include identifying the data exporter and importer, specifying categories of personal data, describing the transfer purpose, listing technical and administrative safeguards, and selecting jurisdictions for the governing law and dispute resolution. However, the core obligations and legal language, including terms governing liability, processing, data subject rights, and third-party data management, must remain exactly as drafted by the European Commission. Any unauthorized modifications may render the SCCs invalid.

I clearly remember a meeting from earlier in my career when I explained this to a senior legal executive who was accustomed to negotiating any contract clause and took pride in doing so. When I said that the SCCs had to remain intact, their expression shifted from curiosity to disbelief. The concept of a contract with mandatory language, outside their control, challenged their entire approach to this kind of agreement.

Despite the evolution of data protection law, the principle has remained unchanged: SCCs must be treated with precision. In recent years, I have encountered organizations embedding SCCs into larger commercial contracts and surrounding them with contradictory terms. For instance, a data processing agreement might contain a provision stating that the data recipient may use personal data for “any business purpose,” which directly conflicts with the SCCs’ strict purpose limitation. These contradictions undermine the clauses’ validity and expose the organization to regulatory scrutiny and potential legal action.

To ensure compliance, organizations must carefully complete the permitted sections of the SCCs, avoid modifying the core legal text, and review related contract language to eliminate any inconsistencies. SCCs are legal instruments designed to ensure uniform data protection standards across borders. Their rigidity is intentional, and their misuse carries serious consequences.
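As one way to operationalize that review, a team might keep a simple checklist of the party-completed fields and scan the surrounding contract for known conflicting phrases. Below is a minimal illustrative sketch in Python; the field names and flagged phrases are hypothetical simplifications, not the official annex structure or an exhaustive conflict list.

# Simplified, hypothetical stand-ins for the party-completed SCC fields
# (the annexes); the European Commission's core clause text is fixed and
# is deliberately not represented as editable here.
REQUIRED_PARTY_FIELDS = [
    "data_exporter",
    "data_importer",
    "data_categories",
    "transfer_purpose",
    "technical_and_organizational_measures",
    "governing_law",
]

# Hypothetical examples of surrounding-contract language that conflicts
# with the SCCs' purpose limitation (see the "any business purpose" example above).
CONFLICTING_PHRASES = ["any business purpose", "sole discretion to use personal data"]

def review(annex_fields: dict[str, str], surrounding_text: str) -> list[str]:
    """Flag missing annex entries and contradictory surrounding language."""
    problems = [f"Missing annex entry: {f}" for f in REQUIRED_PARTY_FIELDS if not annex_fields.get(f)]
    lowered = surrounding_text.lower()
    problems += [f"Possible conflict with SCC purpose limitation: '{p}'"
                 for p in CONFLICTING_PHRASES if p in lowered]
    return problems

print(review({"data_exporter": "ExampleCo EU"},
             "Recipient may use personal data for any business purpose."))

A checklist like this is no substitute for legal review, but it reflects the structural point: the annexes get completed, the core text stays untouched, and the surrounding contract must not contradict it.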

My long-standing experience with these clauses, from their earliest versions to the modern iterations, has taught me that respecting their structure is not just about compliance. It is about demonstrating a responsible and informed approach to cross-border data protection.

Resolving the Contract Conundrum

Data Privacy and Data Protection cannot be afterthoughts in contracting. They must be proactive, embedded practices that align legal language with operational data use. This requires team training and a fundamental shift in how organizations perceive contracts: from static business tools to dynamic instruments of trust and compliance.

We must reframe contracts as living documents that evolve in response to changing legal norms, technological advancements, and shifting public expectations. This includes:

• Conducting periodic contract audits with a privacy lens

• Training staff to spot problematic language around data processing

• Vetting third parties for data protection management before executing agreements

Organizations that invest in updating their contract practices are better positioned to avoid litigation, mitigate risk, and build durable customer trust. Those that ignore the contract conundrum, however, are simply inviting trouble: legal, operational, and reputational.

Data Privacy and Data Protection are not just legal obligations. They are business imperatives. And contracts, often overlooked and misunderstood, are the first and best place to get them right and make Data Privacy a Business Advantage.

Do you need Data Privacy Advisory Services? Schedule a 15-minute meeting with Debbie Reynolds, The Data Diva.
