Lawsuit Claims Photobucket Registered Inactive Users in Disputed Privacy Procedures


### **Class Action Lawsuit Challenges Photobucket’s Strategy to Monetize User Photos for AI Training**

Photobucket, formerly a leading photo-sharing service of the MySpace era, is facing a class action lawsuit that threatens to derail its contentious plan to sell user-uploaded photos—including sensitive biometric data—to firms training generative AI models. The lawsuit, filed on December 11, 2024, claims that Photobucket violated privacy laws by failing to obtain explicit user consent before monetizing their images.

This legal confrontation highlights increasing worries regarding the exploitation of personal data, especially biometric information, in the realm of artificial intelligence. With potentially 100 million users affected, this case could establish a vital precedent for data privacy and the responsible use of AI.

### **The Claims Against Photobucket**

The lawsuit revolves around Photobucket’s recent update to its privacy policy, which disclosed intentions to license user photos—including facial and iris scans—to AI companies. Plaintiffs contend that this action breaches strict privacy laws in states like Illinois, California, and New York, which mandate that companies must obtain written consent before collecting or selling biometric data.

Key claims include:

1. **Unauthorized Sale of Biometric Data**: Photobucket purportedly sold biometric data without user approval, infringing on laws such as Illinois’ Biometric Information Privacy Act (BIPA), recognized as one of the most rigorous biometric privacy laws in the U.S.

2. **Deceptive Communication**: Plaintiffs accuse Photobucket of employing misleading emails to pressure inactive users into accepting revised terms of service. These emails, characterized as attempts to “protect” user data, reportedly coerced users into consenting to the new Biometric Information Privacy Policy—even if they simply wished to delete their accounts or download their photos.

3. **Automatic Enrollment**: The lawsuit asserts that users who overlooked Photobucket’s emails were automatically enrolled in the new policy after 45 days, compounding the alleged violations.

4. **Effects on Non-Users**: The case also covers individuals who never registered for Photobucket but appear in photos uploaded by others. Their biometric data could have been sold without their knowledge or approval, potentially widening the lawsuit’s scope.

### **Potential Repercussions for Photobucket**

Should the court determine that Photobucket breached privacy laws, the financial consequences could be immense. Plaintiffs are pursuing punitive damages of up to $5,000 for each “willful or reckless violation” of biometric privacy laws. With over 13 billion images in Photobucket’s database—approximately half of which are reportedly public and available for AI licensing—the penalties could rapidly accumulate into billions of dollars.
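To illustrate how quickly that exposure scales, here is a rough back-of-the-envelope sketch. It uses only figures cited in coverage of the case (up to $5,000 per willful or reckless violation, roughly 100 million potentially affected users); the fraction of claims a court might ultimately recognize is a purely hypothetical assumption.

```python
# Back-of-the-envelope estimate of potential statutory exposure.
# Per-violation penalty and user count come from coverage of the lawsuit;
# the share of claims a court might credit is purely hypothetical.

PENALTY_PER_VIOLATION = 5_000             # statutory maximum for willful/reckless violations (USD)
POTENTIALLY_AFFECTED_USERS = 100_000_000  # figure cited in coverage of the case

for recognized_share in (0.001, 0.01, 0.1):  # hypothetical fractions of claims that succeed
    exposure = POTENTIALLY_AFFECTED_USERS * recognized_share * PENALTY_PER_VIOLATION
    print(f"If {recognized_share:.1%} of users prevail: ${exposure / 1e9:,.1f} billion")
```

Even under the most conservative of these hypothetical scenarios, recognized claims from just 0.1% of users would put the exposure around half a billion dollars, which is why plaintiffs describe the potential liability as reaching into the billions.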

The lawsuit also aims to:

- **Cease Photobucket’s Data Sales**: Plaintiffs are seeking an injunction to prevent the company from selling or licensing user data without appropriate consent.
- **Compensate Affected Users**: Plaintiffs call for Photobucket to repay unlawfully acquired profits and reimburse users for the unauthorized use of their data.
- **Identify AI Companies**: The lawsuit seeks to reveal the identities of AI companies that acquired the data, which could result in further legal challenges under state privacy regulations.

### **Wider Implications for AI and Privacy**

The case against Photobucket is part of a broader discussion regarding the ethical treatment of personal data in AI development. Generative AI models, such as those utilized for facial recognition or image synthesis, necessitate vast datasets for effective training. However, using personal photos without consent raises critical ethical and legal considerations.

#### **Deepfake Concerns**
One concern among plaintiffs is that AI models trained on Photobucket images could facilitate the production of realistic “deepfakes” or inadvertently reproduce user photos. This could lead to identity theft, fraud, or other forms of misappropriation.

#### **Data Transparency**
The lawsuit also emphasizes the necessity for improved transparency regarding how companies manage user data. State privacy regulations often require firms to disclose the duration for which biometric data will be stored and its intended use. Plaintiffs argue that neither Photobucket nor the AI companies purchasing the data have adhered to these stipulations.

### **Photobucket’s Reaction and Future Outlook**

Photobucket has not yet publicly addressed the lawsuit, but CEO Ted Leonard previously acknowledged the company’s intention to license images for AI training. In an October 2024 interview with *Business Insider*, Leonard characterized the initiative as a method to generate “significant” revenue to reinvest in the platform. However, he did not provide specific information regarding the agreements or the companies involved.

Legal experts indicate that Photobucket’s defense may rely on whether its updated terms of service can be deemed a legitimate form of user consent. Nonetheless, plaintiffs argue that coercive measures and automatic opt-ins negate any assertion of informed consent.

### **What’s Next?**

Photobucket has roughly 30 days to respond to the complaint.