Your medical records are about to become a lot more accessible—but at what cost? OpenAI has just launched ChatGPT Health in the US, a feature that promises to analyze your health data and provide personalized advice. Sounds revolutionary, right? But here’s where it gets controversial: privacy advocates are sounding the alarm. The tool encourages users to share not just their medical records but also data from apps like MyFitnessPal, Apple Health, and Peloton. While OpenAI claims this data will be stored separately and won’t be used to train its AI, the question remains: Can we trust that our most sensitive information won’t fall into the wrong hands?
OpenAI insists that ChatGPT Health is designed to support, not replace, medical care, and that it won't be used for diagnosis or treatment. But with over 230 million people already asking ChatGPT health-related questions each week, the stakes are sky-high. In a blog post, the company highlighted enhanced privacy measures to protect sensitive data. Yet Andrew Crawford of the Center for Democracy and Technology warns that health data is among the most sensitive information people share, and that safeguards must be airtight. Especially as OpenAI explores advertising as a business model, the separation between health data and other conversations must be foolproof.
And this is the part most people miss: Generative AI tools like ChatGPT can sometimes produce false or misleading information, often in a way that sounds incredibly convincing. Max Sinclair, founder of AI marketing platform Azoma, calls this a watershed moment—one that could reshape patient care and retail. But is it a step too far? With competition heating up from rivals like Google’s Gemini, OpenAI is positioning itself as a trusted medical adviser. Could this be a game-changer, or a risky gamble with our privacy?
For now, ChatGPT Health is available only to a small group of early users in the US, with plans to expand. It's notably absent in the UK, Switzerland, and the European Economic Area, where strict data protection rules apply. In the US, Crawford warns, health data may be collected and used by firms not bound by comparable protections. Without robust safeguards, sensitive health information could be at real risk.
So here's the big question: Is the convenience of personalized health advice worth the potential privacy trade-offs? Let us know what you think in the comments—this is a debate we all need to have.