Responding to Sweeney

I am again discussing the privacy comments from Dr. Latanya Sweeney. She testified to Congress that the security models of both NHIN CONNECT and NHIN Direct were flawed.

Figure 2(b) summarizes concerns about these two designs. The NHIN Limited Production Exchange has serious privacy issues but more utility than NHIN Direct. On the other hand, NHIN Direct has fewer privacy issues, but insufficient utility. When combined, we realize the least of each design, providing an NHIN with limited utility and privacy concerns.

You mean both projects are un-private?

I have recently posted about the assumption that NHIN Direct is less functional than NHIN CONNECT. So now I want to talk only about the privacy failings that Dr. Sweeney implies.

I summarize the statement above, and her testimony generally, to mean that Dr. Sweeney believes that “NHIN CONNECT and NHIN Direct both fail to protect privacy, but NHIN Direct is the lesser of two evils”. That is a meaningless statement. It is easier to see this if you speak in terms of well-known Open Source applications. For instance, if I say “Apache does not support privacy” or “Firefox does not support privacy”, it becomes pretty clear that the point is hogwash. I can use Apache to set up a website that only I and my family can access. I can also use Apache to create a website that abuses users' privacy the same way that Facebook does. Similarly, Firefox can be configured through its settings to behave in more private or less private ways.

Moreover, both projects can serve as software platforms that allow other programs to increase or decrease privacy. mod_ssl is a perfect example with Apache, and this is even more apparent with Firefox, which already has tools that help publish browsing habits, and also has several tools to make browsing more private.

As with any software project and protocol, it is possible that there are privacy implications in the underlying protocols. It is also possible that, through mis-implementation, either Open Source project could create a security or privacy risk where none exists in a correct implementation of the protocol. There is nothing to be done about this. It is a problem with software generally, and the best we can do is to put both the protocols and the implementations of those protocols out in the open so that security researchers can look for flaws. For both NHIN CONNECT and NHIN Direct this has already been done or is happening now.

Furthermore, the underlying protocols for both NHIN Direct and CONNECT are designed to allow for different kinds of privacy policy and enforcement. From a configuration point of view, both projects will be able to support extensive consumer consent options, for instance, or they could be configured to generally ignore consumer consent. For that matter, they could both be configured not to share any information at all, or to accept only in-bound data and never send data out.
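To make "configurable consent" concrete, here is a minimal Python sketch of a gateway-side policy check. The `ConsentPolicy` fields, the purpose codes, and the `may_disclose` function are all my own invention for illustration, not part of either project's specification:

```python
# Hypothetical sketch of a consent-policy gate at an exchange node.
# Field names and purpose codes are illustrative, not from any NHIN spec.
from dataclasses import dataclass, field


@dataclass
class ConsentPolicy:
    share_outbound: bool = False                 # may this node send patient data out?
    accept_inbound: bool = True                  # may this node receive patient data?
    allowed_purposes: frozenset = frozenset({"TREATMENT"})


def may_disclose(policy: ConsentPolicy, direction: str, purpose: str) -> bool:
    """Return True only if the configured policy permits this exchange."""
    if direction == "outbound" and not policy.share_outbound:
        return False
    if direction == "inbound" and not policy.accept_inbound:
        return False
    return purpose in policy.allowed_purposes


# A restrictive configuration: accept in-bound data, never send, treatment only.
strict = ConsentPolicy(share_outbound=False, accept_inbound=True)
print(may_disclose(strict, "outbound", "TREATMENT"))  # False
print(may_disclose(strict, "inbound", "TREATMENT"))   # True
```

The point is that "does it protect privacy?" is answered by the deployed configuration, not by the software itself.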

From a practical point of view, most doctors in the US have email accounts but rarely use them to send PHI. When they do send PHI, it is usually legal and privacy-respecting. This is not always true, but there is nothing we can do with “email” to make it more true. It is not about the technology but how you use it.

Design Patterns are a Straw Man

Dr. Sweeney suggests that “For example, a domestic violence stalker can use the system to locate victims.” But each node on the current and future NHIN Exchange will have the ability to monitor for strange searches, and should be able to easily detect if a user frequently searches for 20-25 year old women who live near them. Are those policies and procedures enough? Hard to say, since I have no idea what they are. Dr. Sweeney does not indicate what they are either, but still makes this assertion.
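As a toy illustration of the kind of audit-log screening a node could run, here is a Python sketch that flags users whose queries repeatedly match the stalker pattern described above. The log format, the demographic filter, and the threshold are invented for illustration; real policies would be far richer:

```python
# Illustrative sketch: scan an access log for a user who repeatedly
# queries records of 20-25 year old women in their own area.
# The tuple format and the threshold of 5 are assumptions, not policy.
from collections import Counter


def suspicious_users(audit_log, threshold=5):
    """audit_log: iterable of (user, patient_age, patient_sex, same_area)."""
    hits = Counter()
    for user, age, sex, same_area in audit_log:
        if sex == "F" and 20 <= age <= 25 and same_area:
            hits[user] += 1
    return {user for user, count in hits.items() if count >= threshold}


log = [("drsmith", 23, "F", True)] * 6 + [("nurse_a", 60, "M", False)] * 3
print(suspicious_users(log))  # {'drsmith'}
```

A fax machine leaves no such log to scan; an electronic exchange does.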

Generally, asserting privacy violations without evidence seems to be the modus operandi for the Patient Privacy Rights team, and Dr. Sweeney is holding to form. Ironically, the new NHIN Exchange should allow for detection of the kind of abuse that Dr. Peel and Dr. Sweeney assert is common, while the current fax-based system lets that abuse go undetected. Dr. Sweeney gives several “for examples”:

  • for example, a domestic violence stalker can use the system to locate victims.
  • for example, an insider could receive notifications of all abortions performed at other organizations.

In both of these examples, if the “black hat” in question is currently monitoring all incoming faxes at a local Planned Parenthood headquarters, and has a pen and paper handy, he can get all of this information now… but there is no way to detect that. It would not have to be a betrayal by an insider, either. A tap could be placed on the fax line, even from outside the building. Both of these attacks are undetectable today. Ironically, the NHIN Exchange would sometimes prevent these kinds of abuses, and the rest of the time it would provide precisely the evidence of information leaking that both Dr. Peel and Dr. Sweeney assert is common.

Dr. Sweeney asserts:

Corrections (see Figure 5). In the data sharing environments described so far, there is no mechanism for propagating corrections or updating patient information.

But then she also says:

In one version [10], event messaging allowed 3rd party notification of patient information outside the direct care of the patient and without the patient’s knowledge.

I do not want to straw-man her position here; she is talking about two different theoretical designs as she makes these statements. But there is a tradeoff here. The actual standards (Section 1.3) implemented in NHIN Exchange specifically state that:

In addition to “Query/Retrieve” and “Push”, the NHIN must support a publish and subscribe information exchange pattern in order to enable a wide range of transaction types. HIEM defines a generic publish and subscribe mechanism that can be used to enable various use cases. Examples include….   Support for notification of the availability of new or updated data

So in the real world, we do have a problem with “event messaging” potentially being used as a means to violate privacy, but because of event messaging we do not have a problem with broadcasting corrections to patient data. There is a tradeoff here: exactly the kind of tradeoff that Dr. Sweeney says we do not need to make:
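For readers unfamiliar with the pattern, a publish-and-subscribe mechanism in the spirit of the HIEM quote above can be sketched in a few lines of Python. The broker, the topic name, and the message fields are all illustrative, not taken from the HIEM specification:

```python
# Minimal publish/subscribe sketch: subscribers register a callback per
# topic, and a published message fans out to every subscriber of that
# topic. This is how a correction can reach everyone holding stale data.
from collections import defaultdict


class Broker:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subs[topic]:
            callback(message)


received = []
broker = Broker()
broker.subscribe("patient.correction", received.append)
broker.publish("patient.correction",
               {"patient": "patient-123", "field": "allergy", "new": "none known"})
```

The same fan-out that propagates a correction is what makes unwanted third-party notification possible, which is precisely the tradeoff at issue.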

Figure 2(a) depicts the traditional false belief of trading privacy for utility. It also shows our 9-year experience in the Data Privacy Lab of finding sweet spots of solutions that give privacy and utility. The key to our success has been technology design.

I want to be clear: I agree that there are sweet spots of “acceptable privacy and acceptable utility”. In fact, in this situation the “technology fix” is to do extensive logging and auditing of who is looking at what, so that you can detect the abuse that a comprehensive alert system makes possible. I could talk about how you might do that, but again, looking at the actual standards you find a comprehensive description (Section 5).
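As one hypothetical sketch of such a "technology fix", the following Python shows a hash-chained audit log: each entry commits to the hash of the previous one, so deleting or altering a record is detectable on review. None of this is taken from the NHIN specifications; it only illustrates the idea that auditing can be made tamper-evident:

```python
# Sketch of a tamper-evident access log. Record fields are illustrative.
import hashlib
import json


def append_entry(log, user, patient, action):
    """Append a record whose hash covers its content and the previous hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    record = {"user": user, "patient": patient, "action": action, "prev": prev}
    body = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(body).hexdigest()
    log.append(record)


def verify(log):
    """Recompute the chain; any edit or deletion breaks a link."""
    prev = "0" * 64
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True


log = []
append_entry(log, "drsmith", "patient-123", "VIEW")
append_entry(log, "drsmith", "patient-456", "VIEW")
print(verify(log))          # True
log[0]["user"] = "intruder"
print(verify(log))          # False: tampering is detected
```

With logs like this, the "who looked at what" question has an answer that even an insider cannot quietly rewrite.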

Dr. Sweeney ends her testimony with the suggestion that:

Performing risk analyses on design patterns provides a clear, informative path.

But that is simply not true. In fact, her own “start” to performing a risk analysis on design patterns serves only to devalue the work that has already been done on NHIN CONNECT and is now being done on NHIN Direct. Criticizing a design pattern not used in a given piece of software, and then implying that the software suffers from those problems, is a straw-man process.

Dr. Peel has said to me that she feels that Dr. Sweeney is being ignored, and I can see why now that I have carefully read her testimony. I would welcome specific criticisms of the security protocol designs that we will be using on NHIN Direct. But I would suggest that Dr. Sweeney criticize either particular software or at least a specific “design”, rather than a “design pattern”. Currently Dr. Sweeney is over-simplifying NHIN CONNECT by talking about “design patterns” which are not used. The first consensus draft of the NHIN Direct security protocol design does not even exist as I write this post, while Dr. Sweeney’s testimony is already more than a month old. I fail to see how she could say anything legitimate about the NHIN Direct security and privacy design at all, one way or the other.

I would suggest, as I said to Dr. Peel in person not 48 hours ago, that if Dr. Peel, Dr. Sweeney, and Patient Privacy Rights generally want to be included in the relevant discussions, they try to keep their discussions relevant. You are literally criticizing what we are NOT doing, and then implying that ONC is not working with you. I know that your hearts are in the right place, but I cannot code what you describe.


2 Responses to “Responding to Sweeney”

  1. twitter

I'm sorry that people are beating around strawmen when there are much larger issues in PHI, like the use of non-free software. Non-free software users have to trust a vendor not to violate their privacy, but most vendors promise to do just that. The Windows EULA, for example, gives Microsoft the ability to inspect all files on a user's computer without notice. Since Vista, the system indexes user files and sends regular encrypted communications back to Microsoft. This is a system that no one should ever use with patient information. Other non-free systems may not be as blatant, but they can't be trusted because the programs' owners lock users out of their operations.

Your email analogy is apt. Free software email clients have GNU Privacy Guard, which encrypts messages. If the protocols you are talking about are implemented in free software, they can be used with OpenSSH. PHI should be encrypted end to end and only looked at with privacy-respecting software like Debian GNU/Linux. Networks should not be trusted.


Comments are closed.