The Health Internet

For whatever reason, people still do not get the basics of the Health Internet. Part of the problem is that, until recently, the marketing term was the Nationwide Health Information Network, or NHIN. The Feds recently decided to start calling the project the Health Internet, because that name gives a much better idea of what they are trying to achieve.

Please do not be the guy/gal who writes in my comments that "the Internet is not secure, that means my privacy will be violated." That is pure FUD and is not how the Health Internet will work. It is a relatively simple process to make the Health Internet into a zone that is more secure and private than the current health information infrastructure. Notice that I did not say "secure", I said "more secure". Your bank is not "secure", your doctor's paper records are not "secure", the CIA is not "secure". As an adjective, "secure" is more like the human attribute of "tall". I am typically considered a tall person, but in college I was a student athletic trainer for my school's basketball team. In that crowd, I was short. While there may be one and only one person who can be considered universally "tall", it is well understood that this is a relative term. Similarly, the Health Internet is relatively more secure than current systems. I personally am far more comfortable having my private data in the Health Internet than I am with having my paper records locked in my doctor's office. You should be too.

So you should not be worrying about security or privacy in the Health Internet… Really… It is as close to a solved problem as it gets. There are obviously always ways to make things more secure… but taller is not always better.

So what does the Health Internet buy you as an individual living in the US? To put it simply, you and your doctors should eventually be able to get to all of your health information as easily as you now get access to your financial information. It's a big promise, but the design of the Health Internet should eventually make that kind of convenience and access a reality.

Given that, it becomes obvious why rebranding to the Health Internet is a good idea, for several basic reasons:

  • the original Internet started life as a government network (ARPANET)… And that has turned out pretty well.
  • the reason that the original Internet was such a hit was that people built neat stuff on top of it. Similarly, the Feds are hoping that people will use the Health Internet as the platform for further innovation.

So the Health Internet is a good thing and everyone should embrace it.

So how do you jump start a Health Internet? You do it by providing Open Source Software that enables people to participate in the new network.

Most people do not really understand the relationship between Open Source networking projects and the success of the original Internet. Here is how this breaks down:

Most of the Internet servers that provide X do it using Open Source project Y. With that as a template, look at the following chart:

Email: sendmail
DNS: BIND
Web server: Apache

Of course, you -can- use proprietary software for these components, but the Internet as we know it would not exist without these very low cost tools that provide a substantial portion of our Internet infrastructure. So what's the plan for the Health Internet? Simple.

Health Internet: CONNECT

The CONNECT project is an Open Source project that -will- run the core Health Internet. The core will connect major government health data sources, including the VA and the DoD, into the initial Health Internet core. Most importantly, the CONNECT software is available for local exchanges to connect into the core Health Internet.

Overall, the strategy of creating an Open Source project that can be used fractally to create other, connected, networks is a proven one. It's a smart move, and it is going to change Health Informatics in a fashion very similar to the way the Internet has already changed computing generally.

Surescripts agrees to modify NDA to be compatible with Open Source licenses

As many of you know, I am often asked to represent the FOSS Health IT community in negotiations with various organizations. My first opportunity to do this was with CCHIT, and that negotiation has turned out pretty well. Then I represented the FOSS community at the NCVHS hearings on Meaningful Use.

Most recently, I have had requests from the community regarding Surescripts (who appropriately use a .net domain name… because they are a network!!).

For those that do not know, Surescripts (after the merger with RxHub) is essentially the only way to communicate electronic prescription messages in the United States. However, many in the FOSS community felt that the Surescripts Non-Disclosure Agreement prevented FOSS implementations of the Surescripts interface.

I just got off the phone with Rick Ratliff and Paul Uhrig from Surescripts, and they agreed to modify the NDA to explicitly allow the release of Surescripts implementations under Open Source and Freedom Respecting Software Licenses. In fact, from their perspective, this was implicitly allowed under the current NDA.

To move forward I have asked representatives from Medsphere and ClearHealth (Open Source vendors who already have a working relationship with Surescripts) to work with Surescripts to produce a short modification to the Surescripts NDA which will explicitly allow for a FOSS release. Once they have finished that language, we will present the resulting changes to the community at large to make sure it works for everyone. After this, Surescripts has agreed to add the changes to the default NDA.

While this issue will not be fully resolved until we have FOSS implementations that can actually access the Surescripts network, this is a huge step forward. I would like to thank Paul and Rick for making time for me in what must be a tremendously busy schedule.

Regards,

-FT

Network Effect vs Open Source

Something I have been thinking a lot about lately is the issue of Software as a Service, and how that model works with the network effect and open source software.

My thinking is prompted by a service that I am thinking of launching. The code behind the service is very simple, and while I have a predilection to release everything I do under FOSS licenses, I am thinking of not releasing the code for this. Notice that I am not talking about making a proprietary software product; that would be unethical. I am talking about offering a service over the Internet, using code that is kept private. Private code is ethical, proprietary code is not. It is a matter of control: proprietary software makes a user run software that they have no control over. Private software running a network service is often called the ASP loophole of freedom respecting software licenses like the GPL (but not the AGPL), but it is basically ethical because the user is not actually running the software at all; they are just accepting a benefit from that software.

The moral issue gets convoluted when you have a service that maintains user data on the foreign site, rather than just providing a take-it-or-leave-it service. Google, for instance, is in a very different position of responsibility when it chooses to offer an email service rather than a search service. If Google stopped providing search, that would suck, but if gmail went down and took years of my correspondence with it… that would -really- suck.

For certain kinds of critical data, I think it is unethical even to use private code. This should seem especially obvious for health information.

Before we get to my issue, I wanted to point out another organization that is in essentially the same position: StackOverflow.com

StackOverflow is a site that supports the ability to ask very specific technical questions and then rank the answers that result. You see, if StackOverflow released its code as open source, then hundreds of separate question-answering sites could start, each of which would have only a trivial number of users. Joel (as in Joel on Software) discusses this issue in a podcast (transcript):

Spolsky: Well, but they will suck away some the audience that might have come to us, thus reducing the network effect, and thus reducing the value to the entire community.

As long as StackOverflow is in -one- place, all of its users go to one place to ask and answer questions. There is a network effect in all of those users going to the same location: it means more questions and more answers. More questions and more answers mean better questions and answers, since the whole point of the StackOverflow architecture is that "more" becomes "better" through user voting. Better answers mean that more people will go there to search, which means more users, which means more questions/answers, which means better answers, which means better users, and you have the loop: the critical upward spiral of community collaboration, where the more users you have, the more valuable the central resource is.

What does this sound like? It sounds like open source software development, and the way Wikipedia works. In fact, there is a whole book about this upward-spiral-through-open-collaboration effect, called Wikinomics.

But the upward spiral of the -content- on StackOverflow would be hindered by open sourcing the code. The code would obviously improve if it were open sourced, but the content would degrade. (Aside: It might be possible to find a way to turn the StackOverflow model into a protocol too, so that you could have multiple instances forming one large distributed system of StackOverflow instances, so that when you searched for bird watching on StackOverflow.com you might get results from BirdOverflow.com or whatever. This is what Google is trying to do with Google Wave.)

It should be noted that StackOverflow actually already open sources the content that it produces, using a Creative Commons license for the questions and answers posted there. They also provide a data dump of the content, so that you can get it for programmatic use without bothering to screen scrape. So they really are making an open source contribution.

Back to my idea. I have a service that I will be launching soon that will also greatly benefit from the network effect on the content, but would be damaged by having multiple instances. I am inclined not to release the source code for this reason, but I have not yet made up my mind…

Update:

This got several good comments very quickly. Thanks for that, I really have not made up my mind on this issue and your comments have been very helpful.

Probably the most important information that I got is that there are several Open Source Stack Overflow clones in various stages of development.

I had searched for Open Source implementations of Stack Overflow and had only found Stacked. Personally, reimplementing something so that it will not be proprietary anymore, and then using a proprietary language (no offense to mono) to do it in, just seems pointless. Of course, I really wish there were something in php, since that is my current crutch language of choice. Hopefully people looking for a GPL or BSD implementation of Stack Overflow might be able to find one now. Drop a comment if you have a good implementation in php!!

-FT

e-prescribing prior art

Hi,
Whenever I hear that someone was doing Health IT a long, long time ago, I always suggest that they find copies of their old code and post them online, so that we can have a strong source of prior art with which to fight software patents.

Recently, Bob Paddock took me seriously and dug up some invaluable prior art on automated prescribing. Today he sent me the results, including scans of printouts of both the printed prescriptions and the source code that made them, all of it with a date so long ago that it would invalidate any still-active patent covering those subjects.

Bravo,

-FT

The iphone, a poor HIT platform analogy

Recently, a NEJM perspective article titled No Small Change for the Health Information Economy advocated that a Health IT platform be created in imitation of some of the successful technology platforms in other areas. Specifically, the iphone was mentioned. The relevant paragraph:

The Apple iPhone, for example, uses a software platform with a published interface that allows software developers outside Apple to create applications; there are now nearly 10,000 applications that consumers can download and use with the common phone interface. The platform separates the system from the functionality provided by the applications. And the applications are substitutable: a consumer can download a calendar reminder system, reject it, and then download another one. The consumer is committed to the platform, but the applications compete on value and cost.

The whole article is worth a read; there are some invaluable fundamental insights provided here that are right on. However, there are problems with the iphone app universe, and imitating that universe will require new solutions to those problems. The NEJM article recognizes some of these implicit difficulties, and suggests that the solution is for the government to step in and evaluate individual applications.

Here is a quick list of things that are true about the iphone that really should not be true in a HIT platform:

  • Apple plays favorites. A lot. Google was given special access to forbidden APIs for a voice application. Nike is another great example of a company that created an application with special privileges: it gets device integration that no one else gets. From Apple's perspective, these kinds of things are acceptable because they create an excellent user experience. But it is not fair to developers. Developers who do not have the clout of Google or Nike know that they might be blown out of the water by a special deal that Apple makes with a bigger partner. It creates risk for developers, and a lot of resentment. Playing favorites gives Apple a short-term advantage but ultimately prevents a true meritocracy from developing. A Health IT platform has to be truly open, and must not play favorites.
  • Apple protects its cash cows at the expense of innovation. Google Voice could have broken the back of the AT&T price for SMS messages, which can cost about $5000 per megabyte. It was rejected by Apple because it hit them in the cash flow. But Google Voice is probably one of the most fundamentally innovative technologies to appear in a long time. A Health IT platform will need to find a way to keep this kind of blatant incentive problem from occurring. It's harder than you might think.
  • Apple's approval process is inscrutable. Sometimes applications are rejected for content, even though that content is already available through Apple elsewhere. The approval process is slow and painful, it does not make sense, and, most importantly, people hate it. The problem is that you have to have an approval process, and the reason Apple is so closed about theirs probably has to do with the unpleasantness of watching sausage get made. It is not trivial to have an approval process that is fair and open while also ensuring that developers do not abuse users. It takes time, which means money, and it is not clear where that money will come from in a Health IT platform.
  • Apple is a locked-in provider of software. This can easily be fixed by jail-breaking your iphone, so that you can easily download apps from other sources. Apple limits the sources of downloads for a reason: you can download anything with a jail-broken iphone… even things that will make your iphone much less stable. How do you ensure that applications are trustworthy without having an exclusive source? Tough one.
  • Apple forbids creating applications that replicate core functionality, which is exactly the opposite of what you want with a Health IT platform. But no one will use the system unless you provide high-quality initial applications.

So the iphone system, as an ideal, is fine to emulate. But you can see where your problems might be with such a platform by looking carefully at the problems that Apple is dealing with.

This is not really a criticism of the authors of the NEJM article… who, for instance, already see that the platform needs to be open source, which addresses many of the problems that Apple is having by default… this is just to point out that all is not well in Apple land… analogies have their limits.

(update 9-01-09: I should talk about the iphone more often; this article has generated more comments, faster, than anything I have written in years. One comment particularly stands out. Piyush Daiya over at androidmedapps.com has provided a very careful analysis showing that Android is a better embodiment of the ten principles that the NEJM authors endorsed. He is 100% right-on about that, and I wish I had thought to point it out myself. Thanks for reminding me, Piyush…)

-FT

Multiple Merged Monitors: the nightmare

I use GNU/Linux on the desktop. This is for both ideological and practical reasons. I develop FOSS Health Software, and my preferred languages work best on GNU/Linux.

I also believe in Software Freedom.

Sometimes I will use proprietary software, if it is obvious that I need to do that in order to 1) further the overall FOSS in Healthcare movement or 2) put food on the table.

It was with some reservation that I used the nvidia proprietary drivers in order to get three monitors working. I have used Red Hat for years, and I usually like to use the latest version of Fedora for my desktop. However, I do not like upgrading as often as Fedora releases, and I often get two or three versions behind. I had three monitors working as a single merged desktop on Fedora 9. Almost immediately after Fedora 9 went unsupported (one month after Fedora 11 left it two versions behind), a yum update of the livna rpms crashed my desktop.

Since then I have been scrambling to get multiple monitors working again as a merged desktop.

This has been a painful, brutal process. I have tried two generations of 4 different major distros. I have bought an entirely new computer. In the end I ordered two huge Dell monitors, because I could only get two monitors to work at one time.

I will spare you the minutiae of what did and did not work. Here is what I discovered during my three-month ordeal:

  • Multiple monitor, merged desktop, and multi-head support is one of GNU/Linux's greatest weaknesses.
  • Multiple monitor setups almost never work out of the box.
  • Debugging multiple merged monitors is a nightmare.
  • Searching for solutions is painful. Nearly infinite software and hardware version differences make what you find almost always useless.
  • While x.org is advancing, it is a very poorly managed project.
  • Nvidia is making development so painful with proprietary licenses that they should be boycotted.
  • By using nvidia's drivers, rather than participating in efforts to replace them, I have been making the problem worse.

I should know better. If I can help it, I will never use a proprietary GNU/Linux module again.

-FT

DocOliver

I have convinced my good friend and partner in healthcare reform, Dr. Cari Oliver, to start a blog.

She will now be talking about her ideas regarding patient engagement/empowerment/involvement/safety at docoliver.com

Whenever someone asks me "how do you do it" about blogging, I always tell them that the secret is to have something to say about something specific.

For instance, here on FredTrotter.com you will get a steady stream of information about Open Source Software in Healthcare, and all things related to it. That's a pretty broad brush that lets me talk about politics, healthcare, and healthcare IT, along with Open Source in Healthcare. My readers know that if they visit or subscribe to my feed, they will generally get information about what is going on in the FOSS Healthcare world, with a generous dose of helpful bias.

At DocOliver.com you can expect to hear about how a patient -should- engage in their own healthcare, and -how- they can use Health Information tools to do it. If I had written the tagline of Doctor Oliver's site, it would be "Making PHRs actually do something useful for you", but she tends to be a little more disciplined and careful with her prose than I am.

-FT

Rackspace instead of Amazon

For now, Health IT related projects should use the Rackspace Cloud instead of the Amazon Cloud.

Some of us are concerned with the issue of Software Freedom. Essentially, you need to have control over what your computer is doing, and with proprietary software, someone else (the copyright holder who has given you a proprietary license) is in control. Software that respects the freedom of its users, often called 'Open Source' software, should be used exclusively in the healthcare domain. This should be obvious if you think about it: it is unethical for clinicians to allow proprietary vendors to control their computers, because clinicians should have custodianship of patient records. If you agree with this paragraph, you really need to join the Liberty Health Software Foundation.

The difference between 'cloud' and 'virtualization' technologies, with regard to GNU/Linux instances, is simple: a cloud adds a structured API for the provisioning and control of GNU/Linux instances.

It is possible to implement a "cloud" in your local data center using projects like Eucalyptus, which essentially allows a large computer, or a set of computers, to act like Amazon's ec2 service.

Is the API that is used to deploy these clouds FOSS or not? If it is not FOSS, it can become a mechanism for proprietary lock-in of health information. It does not matter that you avoided lock-in by using an entirely FOSS stack if you host it at Amazon and cannot leave that service easily.

Remember that we need to be concerned with the continuity of health data for hundreds of years, which is a totally different perspective from most IT applications. You need to be looking forward to the day that Amazon shuts its doors. That day -will- come, and you (or your successors) need to be able to get your instances out of that cloud easily. In the short term, having access to cloud APIs under FOSS licenses helps address the basic concerns that people who respect software freedom have about the whole idea of cloud computing.

Others have discussed this before, but I want to point out that, for the time being, if you want to be safe from all proprietary nonsense in your health information application, you should be using Rackspace, since Rackspace has provided its API to the community under an open source license. That makes the Open Source Rackspace API a new option for those who, like me, believe that software freedom is even more critical in healthcare applications.

I hope that Amazon will soon release its API under a FOSS license, but until it does… use Rackspace.

-FT

(updated 08-10-09 added ‘remember’ paragraph for clarity.)

Securing health applications with CACert.org

Still trying to recover from the conference last weekend.

OpenEMR was out in force at the conference, and we had some interesting discussions about the best way to make php applications more secure. The following code is in php, but the theory applies to any electronic health record. The wonderful thing about this method is that Apache does all of the heavy lifting for you.


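(The original php snippet did not survive the formatting of this post, so what follows is my reconstruction of the kind of check this method relies on, not the original code. It assumes the Apache configuration shown later in this post, with "SSLOptions +StdEnvVars" enabled, which is what makes the standard mod_ssl variables such as HTTPS, SSL_CLIENT_VERIFY, and SSL_CLIENT_S_DN_CN visible to php.)

```php
<?php
// Sketch/reconstruction -- not the original snippet from the talk.
// Apache mod_ssl (with "SSLOptions +StdEnvVars") populates these
// $_SERVER variables; the application just checks them.

// Refuse anything that did not arrive over https.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    header('HTTP/1.1 403 Forbidden');
    die('SSL required');
}

// Refuse anything without a client certificate that Apache has
// already verified against the CAcert.org root certificate.
if (!isset($_SERVER['SSL_CLIENT_VERIFY'])
        || $_SERVER['SSL_CLIENT_VERIFY'] !== 'SUCCESS') {
    header('HTTP/1.1 403 Forbidden');
    die('A valid client certificate is required');
}

// Apache has done the heavy lifting: the CN from the verified
// certificate can now be trusted as the user identity.
$authenticated_user = $_SERVER['SSL_CLIENT_S_DN_CN'];
?>
```

The point is that the application code stays trivial, because the certificate verification itself happens in Apache.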
Of course, none of this works without an apache configuration!!



# another fine way to enforce https only.
<VirtualHost *:80>
        ServerName example.com:80
        AddType application/x-httpd-php .php .phtml .php3
        DocumentRoot "/var/www/html/example/"

        # The following rewrite just forces everything to https!!!
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</VirtualHost>

<VirtualHost *:443>
        ServerName example.com:443
        DocumentRoot /var/www/html/example

        # Standard stuff
        ErrorLog logs/ssl_error_log
        TransferLog logs/ssl_access_log
        LogLevel warn
        SSLEngine on
        SSLProtocol all -SSLv2
        SSLCipherSuite ALL:!ADH:!EXPORT:!SSLv2:RC4+RSA:+HIGH:+MEDIUM:+LOW
        SSLOptions +StdEnvVars
        SetEnvIf User-Agent ".*MSIE.*" \
                nokeepalive ssl-unclean-shutdown \
                downgrade-1.0 force-response-1.0
        CustomLog logs/ssl_request_log \
                "%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"
        # end standard stuff

        # the certificate that CACert.org has signed...
        SSLCertificateFile /etc/pki/tls/certs/example.com.crt
        # my super secret private key
        SSLCertificateKeyFile /etc/pki/tls/private/example.com.key

        # in order to validate the client certificates I need to have
        # a copy of the CAcert.org root certificate
        # (SSLCACertificateFile is only valid at the server or vhost level,
        # not inside a Files block, so it lives up here)
        SSLCACertificateFile /etc/pki/tls/certs/cacert.crt

        # note that I can use the Files directive to protect a single file!!
        # (the filename here is a placeholder)
        <Files "protected.php">
                # requires a client certificate
                SSLVerifyClient require
                SSLVerifyDepth 2
        </Files>
</VirtualHost>