Sara Heath at EHR Intelligence recently published a post pointing out the growing dissonance between EHR ranking systems and usability data. She noted that while the big players in the EHR market like Epic and Cerner consistently receive great rankings for their platforms, the end-user experience of these EHRs is a very different story. While the Best in KLAS ranking went to Epic, Heath cited other surveys that ranked Epic last in usability and found that, across the industry as a whole, 54% of users are displeased with their EHR program.
I’ve discussed some of the reasons behind this in a previous post. It is important to note that negative feedback from users doesn’t necessarily diagnose a usability problem. There are many other factors that could be involved, including negative attitudes towards technology in general, poor training, poor conversions, etc. The fact that satisfaction with an EHR platform increases substantially after it hits the five-year mark seems to indicate that a good part of the problem may be related to training and support: after five years of using the platform, you finally master it and feel good about using it.
But for anyone who is interested in usability, this raises the question: “Why should it take five years to learn a system well enough to be comfortable with it?” You are working with this platform on a daily basis, at least five days a week. That’s 260 days a year, assuming you don’t take a vacation or need to use sick time. If it takes five years to get comfortable with your software, that’s 1,300 days! Honestly, if it takes that long to learn how to use a software system, then there is something very wrong with that software platform.
Heath points out that a good portion of the disconnect could stem from the fact that the rankings in use are industry-based rankings. Industry experts are ranking the EHR products against each other. You can see this in the often-heard praise for things like innovation. Innovation is something that can only be measured by someone who knows a thing or two about what they are measuring. The average user cannot make that kind of call. So is this a case of the fox being placed in charge of the hen house? It seems likely.
Now I’m not saying that industry rankings should be dismissed or that they have no merit. They definitely should be considered if you are contemplating a switch. But equal weight should be given to the usability of the system, which is something that I strongly feel needs to be evaluated and ranked by outside sources. It’s like when you were in college writing essays: after a certain point you become a terrible proofreader of your own work, because you aren’t actually reading it. Because your brain knows what you meant to say, it literally doesn’t see the typos or errors or gaps in the ideas; you need a second set of eyes to catch things like that. The vendors who create the EHR systems are too close to the platform to be able to evaluate it for usability. That is where the users come in, and while they have been vocal in their displeasure, it has not been loud enough.
I would love to see an outside accreditation process for evaluating the usability of EHR software, something the vendors would have to compete for but would have no hand in. Adjustments being made to Meaningful Use might go some way towards incorporating usability, but more is needed.