Anthropic Starts Training Claude on User Chats

Your private conversations with Claude may be used to improve future versions of the chatbot. On August 28, 2025, Anthropic announced that it will update its Consumer Terms and Privacy Policy to begin using user chat transcripts for model training. The change affects users on the Free, Pro, and Max plans and creates a 30-day window to opt out.

What is changing

Key points to know about Anthropic's policy update and how it affects you:

  • Training data usage: Starting September 28, 2025, user chat transcripts may be included in training data for Claude models.
  • Opt-out deadline: Users must act by September 28, 2025 to opt out of Claude model training.
  • Retention: Anthropic states conversations may be retained for up to 24 months for training purposes.
  • PII removal: The company says it will attempt to remove personally identifiable information before using chats for training, though details remain limited.

How to opt out of Claude AI model training

Follow these steps to stop your chats from being used for model training:

  1. Open your account settings in Claude.
  2. Locate the privacy controls or privacy settings section.
  3. Disable the option labeled "Use my data for model training" or similar wording.
  4. Confirm your choice, then verify it took effect via a confirmation email or your account status page.

Multiple tech outlets report that the setting is enabled by default, so users who take no action will have their chats included in training data. If you are wondering how to stop Claude from using your chats, this is the place to start.

Why this matters for privacy and businesses

Real user conversations provide high-quality context and tone for training, but they can also contain sensitive material. For individuals, this raises privacy concerns about personal data in chat history. For companies, the risk includes proprietary information or internal strategy shared in conversations that could influence future model behavior.

Practical tips

  • If you use Claude for work, avoid discussing proprietary information until you have confirmed your company's policy.
  • Consider deleting past conversations you do not want included, though deletion may not affect data already retained under the retention policy.
  • Monitor Anthropic's announcements and its trust and policy pages for changes to the opt-out process and retention rules.

FAQ

Can I opt out after September 28, 2025?

Anthropic set September 28, 2025 as the date when training with user chats begins, and the opt-out instructions above apply until then. After that date, review the company's policy pages for any changes to opt-out options.

Will disabling data sharing delete my past conversations?

Disabling the model-training option should stop future chats from being used, but it may not delete data that has already been retained. Check the privacy pages and follow the conversation-deletion process if one is available.

This update marks a significant moment in the balance between model improvement and user control. If you value keeping your conversations private, adjust your Claude privacy settings before September 28, 2025 to prevent your chats from being used for model training.
