Anyone involved in Information Technology of any kind is familiar with the concept of standardization…and if they are smart, they are also extremely appreciative of it. Several decades ago, as computers and the networks and infrastructure they are now a part of were being developed and spreading across ever-larger areas, it quickly became apparent that without standardization in important areas like I/O ports, the growth of the technology would be limited by technicians' knowledge of, and the availability of, the interfacing connections and devices needed to connect them. And if those factors limited the size of your network, they also limited exactly what could be achieved with the network once it was built. This led to the call for standardization across computing technology and network infrastructure that has become a hallmark of the industry, allowing computers manufactured in one country to easily communicate and share information with computers in another, achieving effects and efficiencies that wouldn't be possible if the networks were smaller.
Banking and financial services run much the same way, even though the industry is much older than the technology that serves as its backbone today. The data, and the procedures for handling, transmitting, and protecting it, are standardized in ways that increase efficiency when data moves between institutions. When you use your debit card at a local store, the information flows to the card processing vendor (who may or may not be affiliated with the bank the store is using), which requests the funds from your bank or credit union (often through the Automated Clearing House, or ACH) and deposits them in the business's account. If the institutions involved use what the industry terms Real-Time/Live Processing, the entire transaction can take a matter of seconds. If they use Batch Processing, it may take several hours to hard post, depending on what time of day the transaction was run and when Batch Processing is scheduled to begin.
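The difference between the two posting models can be sketched in a few lines of code. This is a deliberately simplified illustration, not a real banking system or industry API; the names (`Bank`, `Transaction`, `run_batch`) are my own inventions:

```python
from dataclasses import dataclass, field

@dataclass
class Transaction:
    account: str
    amount: float

@dataclass
class Bank:
    real_time: bool                              # True = Real-Time/Live Processing
    balances: dict = field(default_factory=dict)
    pending: list = field(default_factory=list)

    def receive(self, txn: Transaction) -> None:
        if self.real_time:
            self._post(txn)            # hard-posts within seconds of the swipe
        else:
            self.pending.append(txn)   # sits in a queue until the batch window

    def run_batch(self) -> None:
        # The scheduled batch run hard-posts everything queued since the last run.
        for txn in self.pending:
            self._post(txn)
        self.pending.clear()

    def _post(self, txn: Transaction) -> None:
        self.balances[txn.account] = self.balances.get(txn.account, 0.0) + txn.amount
```

In this toy model, a real-time institution shows the funds immediately, while a batch institution shows nothing until `run_batch` fires on its schedule, which is exactly why the same debit-card swipe can settle in seconds at one bank and hours at another.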
Part of the reason this standardization can happen in this industry is that the data within banking is relatively limited in type. Money is money. Whether you are transferring funds from your checking account into a savings account or making a loan payment, money is the only thing actually moving. What matters in the banking world is the status of the funds: whether the money represents a credit transaction or a debit transaction. With that information you can trace flow, which yields a substantial amount of data and metadata that can be put to many uses later on. Another reason this standardization occurs is a dirty word that many don't like to discuss: Regulation. There was a time when banking was not a standardized industry; hundreds of years ago, fly-by-night institutions would arrive, convince people to hand over their money, and then fold up and leave town, leaving the customers broke and with no recourse to recover their funds. There were also times when a Run on the Bank was common, as so aptly shown in the classic Disney film Mary Poppins. Thanks to regulations governing how financial institutions are run and how they process transactions, the risk of a modern-day Run on the Bank is near zero. (I am speaking of community banks rather than the bigger Wall Street investment banks, which, as we have clearly seen, can do things to create a modern-day Run on the Bank. They are a completely separate discussion.)
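The idea that "money is money" is what makes a uniform record so tractable: every movement is just a debit on one side and a credit on the other, and from those paired entries you can reconstruct flow. A minimal sketch of that tracing, using an invented record format rather than any real banking standard:

```python
from collections import defaultdict

# Each ledger entry: (from_account, to_account, amount).
# Debit the source, credit the destination -- the only "type" of data is money.
ledger = []

def transfer(src: str, dst: str, amount: float) -> None:
    ledger.append((src, dst, amount))

def trace_flow(account: str) -> dict:
    """Net flow per counterparty for one account (negative = money out)."""
    flows = defaultdict(float)
    for src, dst, amt in ledger:
        if src == account:
            flows[dst] -= amt   # money flowed out to dst
        elif dst == account:
            flows[src] += amt   # money flowed in from src
    return dict(flows)
```

Because every entry has the same shape regardless of whether it represents a savings transfer or a loan payment, one small function can trace any account's history, which is the kind of uniformity healthcare data currently lacks.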
That brings me to healthcare IT. I have been watching the field with fascination for many years as something of a more-than-average observer. I am a big proponent of using technology not only to do things that wouldn't otherwise be possible, but also to provide better, safer, and more cost-effective care while making your staff's job easier. Since I am naturally a systems thinker, it's relatively easy for me to see areas within any system that could use improvement. But is the technology up to the task in the case of healthcare? After all, interoperability is a key buzzword in the industry, one the ONC recently announced would be the driving force behind the next decade of healthcare IT development. I believe the technology is 75% of the way toward being able to handle the change…but the practice of healthcare as done in the US is not ready for it.
A few weeks ago I had a very interesting conversation with a nurse and a healthcare IT professional who both work for the same facility. Much of the conversation concerned the number of different technology programs and systems the network had to support, not to mention dealing with all of the different vendors who create them. Simply ensuring that these programs remained functional on the network, that updates to one wouldn't break something else connected to it, and generally achieving compatibility across platforms was an ever-growing challenge. The nurse pointed out that it was indeed a challenge from a usability perspective to go through all of these different software programs to do her job, but that it was necessary because so many different nursing languages were used in different situations, and she hadn't yet seen a system that could accommodate all of them. "Isn't there a single program that can do it all?" she asked.
The painful answer to that question is no…there is not a single system that can do everything. It isn't for lack of trying: many of the larger vendors, like Cerner or Epic, are branching out in an attempt to become a one-stop shop for everything a healthcare system would need. There are many reasons this goal remains elusive, but one of the biggest is a lack of standardization in the data they work with. Is it possible for one software platform, for example an Electronic Health Record (EHR) or a Clinical Decision Support System (CDSS), to accommodate all of the nursing languages that can be used in a given facility? Technically, yes…it could be programmed that way. But the programming would be so complex that it is not cost-effective for even the big vendors to invest in, let alone to price so that it could be self-sustaining and hospitals could afford to buy it. Faced with this dilemma, the vendors are doing what comes naturally: carving out a niche for themselves and integrating many types of software sub-programs within that niche. That works fine for the vendors, but it forces hospitals and medical facilities to buy from several different vendors to support their operations and to try to integrate them all on a single network, creating nightmares from IT, vendor management, risk analysis, compliance, and usability perspectives.
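A toy sketch shows why "just program it that way" gets expensive fast: every additional terminology needs its own mapping table into whatever internal code the platform uses, and every table must be built, validated, and maintained separately. The terms and codes below are invented for illustration; they are not real clinical vocabularies or any vendor's actual data model:

```python
# Hypothetical mapping tables: one per "nursing language" the platform supports.
# Two terminologies describing the same concept with different phrasing.
MAPPINGS = {
    "terminology_a": {"skin integrity impaired": "DX-001"},
    "terminology_b": {"impaired skin integrity": "DX-001"},
}

def normalize(terminology: str, term: str) -> str:
    """Translate a terminology-specific term into the platform's internal code."""
    try:
        return MAPPINGS[terminology][term]
    except KeyError:
        # Without a standard, unmapped terms are a constant maintenance burden.
        raise ValueError(f"no mapping for {term!r} in {terminology!r}")
```

Each new vendor or terminology adds another table like these, and the cross-checking grows with every pair, which is roughly the cost curve that keeps even the big vendors from attempting it.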
So how do you solve an issue like this? Standardization, just as occurred in the banking/financial services industry, where integration between institutions now allows funds to flow almost instantaneously. Healthcare data could flow across systems the same way, but not until it is standardized. In banking, this did not happen until regulations and increased oversight mandated it; individual vendors and institutions were unwilling to work together because doing so might present risks to their businesses in the short to medium term. Can the healthcare sector avoid this? Maybe, if it learns from the lessons of banking and other industries that have gone through the same radical changes.