Health records: turning patients into hackers

by Keeley Wray on December 13, 2010

Credit: D'Arcy Norman/Flickr

A pregnant woman wants to monitor her baby’s activity in response to the foods she eats. She takes her smartphone, plugs in an ultrasound adapter and takes readings after every meal. She logs the contents of each meal and presto! Pattern recognition software tells her that her baby is unfavorably sensitive to dairy. Through a personally controlled portal, she uploads the information to her hospital’s electronic medical record system for review by an allergist once the baby is born.
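
A back-of-the-envelope version of that pattern recognition could be very simple. The sketch below, in Python, just compares fetal-movement counts after dairy and non-dairy meals; the ingredient names and movement counts are invented for illustration, and a real app would pull them from the meal log and the ultrasound adapter instead.

    # Hypothetical sketch: compare fetal activity after dairy vs. non-dairy meals.
    # All data here is invented; a real portal app would read the user's own
    # meal log and ultrasound-derived movement counts.
    from statistics import mean

    meal_log = [
        # (ingredients, fetal movements counted in the hour after the meal)
        ({"oatmeal", "banana"}, 14),
        ({"yogurt", "granola"}, 6),
        ({"cheese", "pasta"}, 5),
        ({"rice", "beans"}, 13),
        ({"milk", "cereal"}, 7),
        ({"salad", "chicken"}, 15),
    ]

    DAIRY = {"milk", "cheese", "yogurt"}

    with_dairy = [n for ingredients, n in meal_log if ingredients & DAIRY]
    without_dairy = [n for ingredients, n in meal_log if not ingredients & DAIRY]

    print(f"average movements after dairy meals:     {mean(with_dairy):.1f}")
    print(f"average movements after non-dairy meals: {mean(without_dairy):.1f}")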

A man with a diabetic foot ulcer takes home two different topical creams. He takes daily pictures of the ulcer and, using image processing algorithms, learns which cream is promoting better healing. He sends this information to his personally controlled health record (PCHR), where it is uploaded to a database monitored by researchers who are correlating genetic data with treatment responsiveness.
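
The image processing here need not be exotic. As a rough, hypothetical sketch: count the “wound-colored” pixels in each day’s photo and track how that count changes under each cream. The filenames, the naming scheme and the crude color threshold below are all invented; a real tool would need calibration, consistent lighting and proper segmentation.

    # Hypothetical sketch: estimate wound area by counting reddish pixels in
    # daily photos. Thresholds and filenames are invented for illustration.
    import numpy as np
    from PIL import Image

    def wound_pixel_count(path):
        """Count pixels whose red channel clearly dominates green and blue."""
        rgb = np.asarray(Image.open(path).convert("RGB"), dtype=int)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        mask = (r > 120) & (r > g + 40) & (r > b + 40)
        return int(mask.sum())

    # One photo per day for each treated ulcer (hypothetical naming scheme).
    for cream in ("cream_a", "cream_b"):
        areas = [wound_pixel_count(f"{cream}_day{day}.jpg") for day in range(1, 8)]
        print(f"{cream}: day 1 = {areas[0]} px, day 7 = {areas[-1]} px, "
              f"change = {areas[-1] - areas[0]} px")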

In the future, users may be able to process electronic health record (EHR) data with the application of their choice and return it to their permanent record via a personally controlled portal. They may also be able to add new data to the record that they collect via mobile devices or other inputs.
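
To make that round trip concrete, here is a hypothetical sketch in Python. The portal URL, access token and JSON layout are all invented; no existing system exposes exactly this interface, but an open, hackable portal could offer something like it.

    # Hypothetical sketch: pull data from a personally controlled portal, then
    # push back a new observation collected at home. Endpoint, token and JSON
    # layout are invented for illustration only.
    import requests

    PORTAL = "https://portal.example.org/api/v1"          # hypothetical endpoint
    AUTH = {"Authorization": "Bearer <personal-access-token>"}

    # 1. Fetch the existing record for use in an application of the user's choice.
    record = requests.get(f"{PORTAL}/record", headers=AUTH, timeout=10).json()

    # 2. Add a new observation collected from a mobile device or home sensor.
    new_entry = {
        "type": "observation",
        "source": "home-ultrasound-adapter",              # hypothetical device
        "timestamp": "2010-12-13T08:30:00Z",
        "value": {"fetal_movements_per_hour": 14},
    }
    resp = requests.post(f"{PORTAL}/record/entries", json=new_entry,
                         headers=AUTH, timeout=10)
    resp.raise_for_status()
    print("entries now in record:", len(record.get("entries", [])) + 1)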

Currently, this vision of the future may seem overly optimistic, given the inflexibility and proprietary formats of existing EHR databases. But if the database structure could be made into an open standard, and redesigned to receive new types of data, a consumer-focused approach to medical data management might really be possible. Some software companies, such as Microsoft (HealthVault) and Google (Google Health), already offer personally controlled health records that aggregate data from multiple providers.

I have one recommendation for health data system developers: make the systems hackable! No, I’m not suggesting leaving the data platform open to malicious attacks. I mean “hackable” in the sense of the word that is quickly growing in popularity: allow the owners of a product to manipulate, repurpose or extend its functionality to serve their own needs. A personally controlled health portal should be flexible enough for users to invent new uses, connections, filters, analyses and input devices.

Outside of health care, “hackable” products are growing in popularity as developers realize they may grab more market share, and maybe even pick up new, useful ideas, by allowing users to open up and expose or add untapped functionality. Users gain more from the consumer experience when they can mold the technology to their individual tastes. For example, Microsoft deliberately left Kinect, the new motion-sensing controller for its Xbox 360 console, open to tinkerers. The same principle could be applied to personally controlled portals for health data; an article in Harvard Business Review touts novel data “mashups” as the next scientific revolution.

Myriad technologies are already available to contribute to a data repository and to derive meaning from the data. Doesn’t it make sense to allow patients to put the technologies together in a way that meets their needs? Like a preschool teacher dumping out a bucket of toys for children eager to get their play on, we could offer patients a bucket of options: wearable or implantable vital sign monitoring devices, machine learning algorithms, calorie and mood diaries, BLAST, natural language processing algorithms, portable EEG headsets and ultrasound probes, lateral flow testing strips, image processing algorithms, epidemiology maps.

Beyond our bucket of tools, we should hope that patients can extend the functionality of their own portals and take data capture and control into their own hands. The patient is usually his own best advocate and has the most at stake: he lives with his experience in his own body 24-7, with access to information in a way no one else can reproduce. Empowered with information and tools, patients just might help solve their own health problems, saving everyone money, time and grief.

3 comments

  • Eric Neal

    Excellent article. I completely agree. Such products are quickly becoming a reality. Allowing people to “tinker” with their own devices can become dangerous, though. Then there is the real fear that others from outside may be able to tinker with the patient’s device for malicious purposes. Every new technology brings light and ease to people, but it seems that if the light is to be accepted, the darkness will also accompany it.

  • Marja

    I like the idea. However, patients should be educated to understand the benefits and dangers, be able to find information about current research easily, and decide whether they want to remain anonymous when they share their info.

    Maybe you find that your mild rash is related to cancer, an STD, HIV or something else you are not ready to share under your name (and note that things can sometimes change as research advances). Maybe you would prefer not to share a genetic condition that leads to an early death with your children over social media.

    Furthermore, in this age you may get advertisements on TV related to your condition because Google has already figured it out. So when you watch a Celtics or Patriots game with your friends, do you want to start getting advertisements for medications for your suspected condition, where your friends may pick up on the pattern?

    That said, a lot of misery can be helped by getting information, understanding what you have, and finding and trying out what works with others. Patients spend much more time with themselves than the doctors do, and they also know other things about themselves that may be related but do not raise flags with doctors, maybe because research has not made the connection yet or because doctors in one area do not know about the research in another area.

  • Keeley Wray

    Thank you for the thoughtful comment, Marja. I agree that data security needs to be a priority for any new platform. Or, at the very least, patients should be informed about when their information is and is not made public, and what the consequences of sharing that information might be. Fortunately, patient social networks are already paving the way to develop best practices. And, in terms of the data security of a platform technology, there are also advances that will help ensure the safety of private data. I think if new companies offer platforms and services with the latest and greatest best practices and security capabilities available, we should be in good shape. Though, of course, caution is warranted. Looking forward to future conversations!
