Microsoft Copilot Health Wants to Read Your Medical Records and Fitness Data

Microsoft Brings AI to Your Health Data
Microsoft has announced Copilot Health, a new AI-powered tool that consolidates your medical records, health history, and wearable fitness data in one place. The tool sits inside Microsoft’s Copilot AI assistant as a separate, secured space designed specifically for health information.
What Copilot Health Does
Copilot Health pulls together data from multiple sources to create what Microsoft calls “a coherent story” of your health. Here’s what it connects:
Wearable data: Activity, fitness, and sleep data from over 50 devices, including Apple Watch, Oura, and Fitbit. This means your steps, heart rate, sleep patterns, and workout data all flow into one AI-powered dashboard.
Medical records: Through a partnership with HealthEx, Copilot Health can access health records including visit summaries, medication details, and test results from more than 50,000 hospitals and provider organizations across the United States.
The AI then analyzes all of this data to deliver personalized health insights, answer questions about your health history, and help you prepare for doctor visits.
Privacy Promises
Microsoft says Copilot Health conversations and data are isolated from general Copilot and protected with additional access controls. Data is encrypted at rest and in transit, and the company claims it won’t use your health information to train its models. Users can manage and delete their information at any time.
Availability
Copilot Health opens its waitlist on Thursday, March 13. At launch, it’s limited to English-speaking adults in the United States, with expanded language support and additional countries to follow.
The Bottom Line
Let’s be real about what’s happening here: Microsoft wants to be the middleman between you and your most sensitive personal data. Yes, it promises encryption and isolation. Yes, it says it won’t train on your health data. But this is the same industry that routinely discovers “bugs” in data handling and rewrites privacy policies quarterly. Handing your medical records, sleep data, heart rate, and medication history to an AI chatbot is a massive trust ask — especially when the fine print says the tool is “not intended to diagnose, treat or prevent diseases.” So what exactly is it for, then? A very expensive health diary that Microsoft gets to read?