Anthropic plans to use your conversations to improve Claude's performance, but you can opt out if you choose.
Anthropic, the US-based AI startup behind the Claude model, has announced a change in its data policy for regular users. Starting from an unspecified date, the company will begin using user chats and coding sessions for model training unless users actively opt out.
By default, the training permission toggle is switched on, potentially leading many users to unintentionally agree to five years of data retention. However, users can opt out of this data collection at any time.
Under the new policy, chats from users who do not opt out will be retained for up to five years; previously, only content flagged for policy violations could be kept for as long as two years. Anthropic says the change will improve Claude's safety and its capabilities in areas such as coding, analysis, and reasoning.
The change in data policy is driven by the industry-wide demand for fresh, real-world data to make AI more capable, accurate, and competitive. Enterprise customers are exempt from these changes, similar to OpenAI's corporate clients.
The rollout of the new data policy has raised concerns due to the prominence of the "Accept" button for new terms and the smaller, less noticeable toggle for training permission. Users are encouraged to review the terms and make an informed decision about their data usage.
The new policy applies to users of Claude Free, Pro, and Max. If users do not opt out by the deadline, the system will enable the use of their data by default. The deadline for opting out has not been confirmed yet, but users are advised to check Anthropic's website for updates.
Anthropic's blog post states that the data policy change is part of its commitment to building smarter, safer AI. The speed and subtlety of this shift in data policy demonstrate how quickly user expectations around privacy are evolving.
Until now, Anthropic had deleted prompts and responses within 30 days, with exceptions for policy violations. The new policy represents a significant departure from that stance.
Users who wish to opt out of data collection can do so by visiting Anthropic's website and following the instructions provided there.