Cybersecurity is a burgeoning sector within Information Technology, and not just in the financial or national defense industries. The healthcare industry is also actively pursuing increased cybersecurity measures. I’ve discussed cybersecurity in healthcare in several previous posts. In Compliance is Not Enough, I discussed Symantec’s 2015 Internet Security Threat Report and how cybersecurity isn’t just about keeping hackers out; it’s also about developing policies and procedures to prevent data loss caused by your own employees. In The Difference Between Proactive and Reactive Thinking, I examined how small medical facilities in rural areas are victims of data breaches just like the big guys, but tend to be more reactive in their responses, a major disadvantage for rural health facilities already facing numerous challenges to their existence. Finally, in The Intersection of Usability and Security, I examined how two of the biggest areas of emphasis within healthcare IT, usability and security, should complement each other in order to increase the effectiveness of both. But today I’d like to look at another aspect of cybersecurity in healthcare: the recruitment of cybersecurity professionals.
TechRepublic, a newsletter and website maintained by IT professionals, recently published a post detailing the need for cybersecurity professionals within Health IT. The article starts by showing that the healthcare industry suffered the largest share of the data breaches that occurred in the first half of 2015; according to Gemalto, healthcare came in first at 34% of breaches, followed by government at 31%. The article then points out that the information leaked in these breaches tends to be usable for identity theft and cannot be easily changed; most of what is stolen consists of names, addresses, Social Security numbers, and employment information, none of which is as easy to change as your credit card number should your card be stolen.
The article speculates that a major factor in the susceptibility of Healthcare IT to data breaches is a lack of qualified professionals within the field. Part of the problem, the article states, is that there is a lack of qualified cybersecurity professionals overall; demand has increased 91% since 2010, and in healthcare the increase was 121%. Coupled with this huge increase in demand is the fact that a large majority of the positions in this sector list a bachelor’s degree (84%) and three years of previous experience (83%) as requirements just to get your foot in the door. Adding to this is the fact that healthcare IT professionals also need to be familiar with major regulatory frameworks, including the Health Insurance Portability and Accountability Act (HIPAA), the Health Information Technology for Economic and Clinical Health Act (HITECH), and the Payment Card Industry Data Security Standard (PCI DSS).
Now, HIPAA has been around for a long time, but HITECH was only signed into law six years ago, and PCI DSS is not actually a regulation; it is an industry-wide set of standards that businesses who process credit/debit card transactions “voluntarily” adhere to in order to be able to process transactions from the major players in the card industry, like Visa and Mastercard. If you aren’t working within the financial or retail sectors, you probably hadn’t even heard of it until two things happened: 1) everyone started paying for everything with debit/credit cards, and 2) major retailers like Target were the victims of major hacks which cost the card industry and the retailers millions of dollars. Until these two forces converged, no one outside of finance or retail really had to give it much thought. But now that they have converged, it’s another acronym to add to your repertoire as a cybersecurity professional, no matter what industry you work in.
What does all of this mean for the shortage of cybersecurity professionals within Healthcare IT? I believe that part of the problem is something Healthcare IT, and many other industries, are facing: the assumption of a “skills gap”. As Human Resources has professionalized and industries of all sorts have begun making their operations and staffing decisions based on a Lean framework, the job descriptions for open positions have become more detailed and more rigid. It doesn’t help that many industries are still operating as if it were 2010-2012, assuming they can find professionals with all of the degrees and qualifications they want because we’ve just gone through a massive recession and there is a glut of workers available.
Over the years, the working world has changed as well, in ways that play into the manifestation of this “skills gap”. Make no mistake, a skills gap has always existed between what employers want and the skills that employees bring into a company or industry; the difference is in how employers respond to that gap. In previous decades, employers were much more willing to hire an employee with a gap in skills and provide the training needed to close it. Now, however, a quick browse through job boards makes it seem as though every employer is looking for a unicorn for every position they are hiring; if you don’t meet every requirement right out of the box, they aren’t going to help you get there…they will just keep looking.
Notice I have put “skills gap” in quotes; I have done this very deliberately because I do not think the skills gap exists in the way the business world seems to think it does. It has always existed; in previous decades it was simply addressed with career-development training on the employer’s side. Now, employers expect new hires to have not only college degrees and extensive experience but, within the tech industry, expensive certifications as well. Many employers are no longer willing to pay for employees to earn these certifications, and I don’t know too many people holding down a job and providing for a family who have that kind of extra time and cash lying around to do it themselves.
I am in no way suggesting that cybersecurity professionals should be hired when they are woefully under-qualified. Requiring a degree and/or experience is definitely a good thing (although requiring a degree AND several years of experience can be problematic, especially for recent grads). But as I’ve looked at positions within Healthcare IT, I have often seen clinical experience listed as a requirement, even for positions that aren’t directly clinical. And as we all know, clinical work is a closed system that has a high barrier for entry in the form of specialized degrees.
Is it truly necessary for cybersecurity professionals to also have clinical experience in order to get their foot in the door in Healthcare IT if they are going to be doing cybersecurity? If it is, then the industry is always going to suffer from a skills gap. I have a strong suspicion that cybersecurity professionals, and IT professionals in general, from any other industry could successfully transition to Healthcare IT with some training on the clinical side as needed, if only the healthcare sector would let them.