
Anthropic Shifts to User Data Training, Raises Privacy Concerns

Tags: AI, Anthropic, Data Privacy, Claude, User Data, Terms of Service, AI Training
August 28, 2025
Viqus Verdict: 8/10 (Data Drift)
Media Hype: 6/10
Real Impact: 8/10

Article Summary

Anthropic is making a significant change to how it trains its AI models, moving from relying primarily on publicly available data to incorporating user-generated content, specifically new chat transcripts and coding sessions. The shift takes effect September 28, 2025 and applies to all tiers of Claude, including the free version. Users must actively opt out: by default, Anthropic is allowed to use their data for model improvement and training. For those who do not decline, data retention extends to five years, raising significant privacy concerns. The update covers all Claude subscriptions except commercial tiers. Users can set their preference during signup or later through privacy settings, but data already used for training remains part of the system's knowledge base. Anthropic says it employs data filtering and obfuscation techniques to protect user privacy and does not sell user data to third parties.

Key Points

  • Anthropic is transitioning to train its AI models on user chat transcripts and coding sessions.
  • Users must actively opt out if they don't want their data used for training; the default setting is 'on'.
  • Data retention will extend to five years for users who do not opt out, a significant change in privacy control.

Why It Matters

This news is critical for professionals and consumers alike. The shift toward user-generated data fundamentally changes the nature of AI development and raises serious questions about data privacy and algorithmic bias. As AI models become increasingly integrated into daily life, understanding how they are trained, and whose data contributes to their intelligence, is paramount. The potential for unintended biases and the long-term implications of five-year data retention require careful scrutiny and proactive engagement from both developers and users.
