Anthropic updates Claude with a user data opt-out option and a five-year retention policy. Users on Free, Pro, & Max plans can decide if chats aid AI training.

Anthropic Rolls Out Policy Updates for Claude Free, Pro, and Max Users

Anthropic is making major changes to the way it handles user information. In a recent update, the company announced that Claude will present users with a new choice: allow their conversations to be used to improve AI systems, or decline participation, with a decision required by September 28, 2025. The move has drawn attention because it places user choice at the center of how the company develops its technology. At the heart of the change is the Anthropic user data opt-out option, which lets people decide whether their chats will be included in future training.

The updates affect users on Claude Free, Pro, and Max plans, including when they access Claude Code through those accounts. They do not cover services governed by the Commercial Terms, such as Claude for Work, Claude Gov, Claude for Education, or API usage through third parties like Amazon Bedrock and Google Cloud’s Vertex AI.

Five-Year Retention Policy Raises Questions

The most talked-about aspect of Anthropic’s privacy policy update is how long data will be stored. For users who opt in, the company says information may be retained for up to five years, a term commonly referred to as the Anthropic data retention five-year policy. For many, this raises questions about how long personal conversations should reasonably be held. While Anthropic says the purpose is to improve transparency and security, not everyone is comfortable with such a long timeline. Separately, in June 2025 Anthropic released a customized AI model for U.S. security forces, named ‘Claude Gov’, to help with special tasks like intelligence analysis, strategic planning, and daily operations.

User Consent Comes to the Fore

Another critical part of this update is how consent is handled. Anthropic’s user-consent model for AI training makes clear that users must actively agree before their data is included in training efforts. The shift is seen as putting more control back in the hands of users, a step that many privacy advocates have been calling for across the tech industry. Still, some argue that not all users will fully understand what they are agreeing to, especially given complex terms of service.

New Consumer Terms Introduced

Anthropic has also released new consumer terms, which outline in greater detail how data is used, why it is collected, and what rights users have when choosing to opt in or out. These terms are designed to be more transparent, but critics point out that legal language often remains hard for everyday users to follow. The challenge for Anthropic is making sure people can make informed decisions without being overwhelmed by lengthy documents.

Balancing Innovation with Privacy

For Anthropic, these changes represent an attempt to balance two competing needs: advancing AI models and respecting individual privacy. The user data opt-out feature is clearly a response to growing public concern about how personal information is handled in the AI industry. Together, the privacy policy update, the five-year data retention clause, and the focus on user consent for AI training signal that the company wants to show greater accountability. Relatedly, in June a U.S. judge ruled in a copyright lawsuit that Anthropic’s use of books to train an AI model is legal, a win for the company.

As the new Anthropic consumer terms roll out, the debate will likely continue. Supporters believe that giving users an opt-out option is a positive step, while critics remain cautious about how long data is stored and how clearly consent is explained. For users, the key takeaway is simple: they now have more power than before to decide how their data is used.

Linda Hadley
