
Microsoft wants to store your healthcare data so that its AI “delivers personalized health insights that you can act on,” but without the liability that comes with actual medical advice.
To that end, the biz has created a supposedly “separate, secure space within Copilot,” dubbed Copilot Health.
The company’s announcement buries the lede. At the end of its post comes the disclaimer: “Copilot Health is not intended to diagnose, treat, or prevent diseases or other conditions and is not a substitute for professional medical advice.”
That’s perhaps for the best in light of a recent UK study that found chatbots give poor medical advice.
Nonetheless, people commonly consult AI models for advice about their health. When OpenAI counted up potential customers, it found more than 40 million people worldwide asking ChatGPT for healthcare advice each day. Eager to tap into that market, OpenAI announced ChatGPT Health in January. Anthropic threw its hat into the ring a few days later with Claude for Healthcare.
Microsoft’s own research on how Copilot is used indicates that almost one in five conversations involves assessing a personal symptom or condition.
In a social media post, Mustafa Suleyman, CEO of Microsoft AI, said, “I think people are still underestimating how profound this transformation is going to be. Today we’re announcing Copilot Health, enabling users to connect all their EHR records and wearable data in a secure, private health space that Copilot can analyze and reason about to provide personalized insights and proactive nudges.”
These personalized insights and proactive nudges are not medical advice, though; they’re intended to promote something more nebulous – wellness. Suleyman suggests that Copilot Health will help people come up with focused questions to put to actual doctors during medical appointments.
Copilot Health is described as a way to help people organize activity data from consumer wearable devices such as Apple Watch, Oura, Fitbit, and others – information that can then be combined into a profile alongside hospital health records and lab results.
Per Microsoft’s disclaimer, this is not intended as medical advice. But it certainly sounds like that’s the goal – Suleyman says that Microsoft wants “to make this service available to the billions of people around the world who struggle to access reliable medical advice.”
But the distinction between regulated medical advice and best-effort AI emissions about health may become more difficult to discern, thanks to the US Food and Drug Administration’s relaxation of wearable rules at the start of the year. As law firm Arnold & Porter noted in January, “the revised policy concerning wearables likely means that more AI-enabled CDS [clinical decision support] can be made available as non-device CDS, i.e., without FDA review.”
Copilot Health comes with assurances about security and privacy, an area where Microsoft’s track record speaks for itself.
“Your Copilot Health conversations and data are isolated from general Copilot and kept under additional access, privacy, and safety controls,” insist Microsoft’s medical messengers Bay Gross, Peter Hames, Chris Kelly, Dominic King, and Harsha Nori.
“Data in Copilot Health is protected with industry leading safeguards, including encryption at rest and in transit, strict access controls, and the ability to manage and delete your information when you choose. You can disconnect your connectors to health data sources such as electronic health records or wearables instantaneously at any time. Your information in Copilot Health is not used for model training.” ®