Chapter 4: Technology

I. Summary

Most conversations around patient-centered healthcare technology center on leveraging innovations in artificial intelligence, machine learning, big data, natural language processing, and other frontier technologies to improve clinical outcomes. While these technologies show promise for improving clinical outcomes (and that should indeed remain the primary focus for healthcare providers), there is equal opportunity to consider how they can be applied to improve the cybersecurity of medical devices, patient records, and the overall healthcare infrastructure. Yet, this remains a nascent conversation, even for the most progressive healthcare enterprises. Likewise, although there is a more robust conversation around existing cybersecurity flaws in medical devices and software, little has been done to shore up the technological infrastructure of our nation’s healthcare providers.

Indeed, recent ransomware attacks on hospitals have shown how easily a health system can be reduced to pen and paper. In a recent series of attacks, several hospital systems had their EHR systems rendered inoperable, forcing staff to coordinate care with handwritten notes. The true costs of these events in lives and resources are difficult to calculate, but the loss of that clinical insight intuitively has a real impact on patient lives.

This is troubling for a number of reasons, especially considering the increasing pace of attempted and successful cyber attacks directed at the healthcare industry in recent years.1 Unmitigated vulnerabilities create potentially existential medical, financial, and reputational risks for providers. Some of these problems, which are described in more detail in the next section of this chapter, are summarized in the table below.

Table 3: Summary of Healthcare Technology Challenges

Area: Unique Characteristic or Challenge
Legacy technologies: Medical devices and software are often used for a long time. Many have vulnerabilities and do not support patches.
Incident readiness: Even though cybersecurity incidents occur regularly, few healthcare delivery organizations or device manufacturers have plans in place to respond to or prevent future cyber attacks.
Budget: Limited budgets and tight margins relegate cybersecurity to a secondary priority.
Regulatory guidance: OCR’s guidance starts by asking organizations to conduct a risk assessment of their environment. That process, however, has been reduced in practice to superficial checklists that leave vulnerabilities unaddressed. Health systems need to be more creative on this front, and OCR needs to provide more examples.
Small- and medium-sized organizations: A burdensome regulatory environment, just-in-time supply chains, and risk aversion prevent smaller organizations from investing in cybersecurity.
System development life cycle (SDLC) practices: SDLC practices for medical devices tend to be weak and under-regulated, not end-to-end secure.

Despite these problems, the healthcare sector possesses a unique opportunity. Since many organizations have yet to introduce many basic cybersecurity protections and technologies, the sector can “get it right” the first time, rather than trying to reshape an already entrenched cybersecurity infrastructure and culture. Policymakers can encourage a movement towards healthy cybersecurity technology posture in a number of ways:

  • Create a government-backed program to encourage the phasing out of legacy technologies and phasing in of secure and interoperable technologies.2
  • Learn from the financial sector’s success in sector-specific cybersecurity investment, spearheaded by the National Cybersecurity Center of Excellence.
  • Leverage a broad array of existing funding programs to spur healthcare cybersecurity basic research and innovation.
  • Create mechanisms for clarifying privacy standards, providing advice, and receiving feedback from health systems, similar to the determination letters issued by the Internal Revenue Service.
  • Strengthen FDA requirements around medical device security, to ensure that security is baked-in at every point in the device’s life cycle.

This chapter describes the technological challenges facing the healthcare sector. While this description presents a stark picture of the many challenges facing healthcare, it also foretells the many policymaking opportunities born of these shortcomings, which are highlighted in the recommendations at the end of the chapter. The range of technologies covered herein is large and includes medical devices (including those that are part of the “medical Internet of Things”), enterprise IT, the cloud and cloud-connected devices, medical device applications and software (perhaps most notably including EHRs), smart building infrastructure, and more.

II. Healthcare-Specific Technology Challenges

The opening statement from the Hippocratic Oath for Connected Medical Devices, a symbolic attestation for the healthcare community crafted by the security and public safety group I Am the Cavalry, reads “New technology introduces new classes of accidents and adversaries that must be anticipated and addressed proactively…The once distinct worlds of patient safety and cyber security have collided.”3 Others have echoed the same sentiment. The June 2017 report of the Health Care Industry Cybersecurity Task Force states, “Now more than ever, all health care delivery organizations…have a greater responsibility to secure their systems, medical devices, and patient data.”4 These statements make two similar assertions:

  • That even the most promising advancements in medical technology could have an insidious flaw that places patients in harm’s way.
  • That healthcare providers and policymakers have a responsibility to proactively address these flaws before they are exploited.

We will address these assertions each in turn, first discussing the flaws associated with medical technologies in this section, and then offering healthcare providers and policymakers recommendations on how to address those flaws in the next.

i. Legacy technologies

Rapid advances in medical device and electronic health technologies have equipped the healthcare sector with a new suite of tools aimed at improving patient outcomes. However, these advancements have created a number of legacy technologies that are vulnerable to cyber exploitation. Legacy technologies are devices and software that are old, outmoded, or outdated in some fashion, but that are still in use. Due to the length of time these devices and software have been in use, malicious actors and threat researchers have been able to identify a large number of vulnerabilities and exploitable security flaws; at the same time, cybersecurity vendors often provide few modern countermeasures for legacy devices. Exploiting a vulnerability within a legacy technology can lead to “medical device malfunction, disruption of health care services (including treatment interventions), and inappropriate access to patient information.”5 The impact of the 2017 global WannaCry ransomware attack is a stark example of the vulnerability of these legacy technologies.6

One factor contributing to the legacy device problem is the lifespan of medical devices, enterprise IT, and systems that house EHRs, which can be used by a healthcare organization for upwards of 15 or 20 years. Old hardware and devices are not necessarily a cybersecurity problem in and of themselves. Rather, the challenge posed by these devices resides in the software they run. The operating systems and off-the-shelf software that undergird these devices have relatively short lifespans, with new versions launched regularly. As new versions are issued, software vendors often discontinue support for previous versions, leaving them largely unpatched and vulnerable. Because newer software rarely finds its way into these systems, outdated medical devices remain in operation, even when the software originally designed to support them has long been discontinued. Further compounding this problem, device manufacturers, healthcare providers, and software companies contest who is responsible for identifying, issuing, and implementing security updates.

More often than not, the disclosure of a vulnerability in a legacy medical device takes the form of an entry in the Common Vulnerabilities and Exposures (CVE) list, a catalog of publicly known cybersecurity vulnerabilities. Oftentimes, even after a bug is identified in a legacy system, it does not get patched, either because a patch was never issued or because it simply was not implemented.7 CVE entries often include statements to the effect of “there is no patch available to address this vulnerability.” In these scenarios, healthcare organizations are left to rely on their existing security infrastructures—such as firewalls and defense in depth models—to protect medical devices from being exploited. However, the security infrastructures that support a healthcare organization’s medical devices and enterprise IT systems often fail to adequately reduce risk, as discussed in the next section.
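The triage problem just described can be sketched in a few lines. The records, field names, and CVE identifiers below are hypothetical, chosen only to illustrate how an organization might separate patchable vulnerabilities from those that require compensating controls; this is not the actual NVD schema:

```python
# Illustrative CVE-style records for legacy devices; all field names and
# identifiers here are hypothetical, not the real NVD data model.
cve_records = [
    {"cve_id": "CVE-2017-0001", "device": "infusion pump",
     "patch_available": False, "mitigation": "network segmentation"},
    {"cve_id": "CVE-2017-0002", "device": "imaging workstation",
     "patch_available": True, "mitigation": "vendor patch 4.2"},
]

def unpatchable(records):
    """Return vulnerabilities with no vendor patch; these must be handled
    with compensating controls such as firewalls and defense in depth."""
    return [r for r in records if not r["patch_available"]]

for record in unpatchable(cve_records):
    print(record["cve_id"], "->", record["mitigation"])
```

When `patch_available` is false, the only recourse is the surrounding security infrastructure the paragraph describes, which is why the quality of that infrastructure matters so much for legacy fleets.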

ii. Incident readiness

An independent report from the Ponemon Institute published in May 2017 found that 67 percent of medical device makers and 56 percent of healthcare delivery organizations anticipated that an attack against one or more of their medical devices would occur over the next 12 months.8 Beyond these troubling forecasts, manufacturers and organizations admitted to past instances where incidents had negatively affected patient health or privacy: 31 percent of device makers and 40 percent of healthcare delivery organizations admitted to being aware of these sorts of incidents. Of those respondents, 38 percent of healthcare delivery organizations said they were “aware of inappropriate therapy/treatment delivered to the patient because of an insecure medical device.” Furthermore, “39 percent of device makers confirmed that attackers have taken control of medical devices.”9 These statistics illustrate the worrying number of confirmed incidents affecting patient privacy and the security of care delivery.

Despite the acknowledged risks, the sense of urgency to attenuate the weaknesses found in medical devices appears to be low: only 17 percent of manufacturers and 15 percent of healthcare delivery organizations are taking significant steps to lessen the impact of future cyber attacks.10 Budgetary restraints aside, the reasons why organizations are not doing more to improve cybersecurity are complicated. It may be that they are not sufficiently motivated by negative factors, like the cost of noncompliance; research shows that HIPAA and FDA requirements are surprisingly ineffective at ensuring the privacy and security of medical systems.11 Or, there may not be enough health-specific information in the wealth of proprietary and open source resources for creating effective incident response plans. This chapter offers policies that address both possibilities.

iii. Budget

The extensive conversation on constrained healthcare budgets from Chapter 3 bears only brief restatement here: most health organizations operate on extremely tight budgets. As a result, healthcare leaders are compelled to make tradeoffs during budget planning, which often relegates cybersecurity to a secondary priority behind such things as hiring additional clinical staff.

iv. Regulatory guidance

This conversation builds on the previous discussion around HIPAA compliance and implementation from Chapter Three – Culture. NIST, HITRUST, OCR, and others have been key in providing guidance to health systems on privacy and security matters related to HIPAA. However, the increasing use of technology across the health system has outpaced the attendant guidance. There was little interaction between technology and the healthcare setting when OCR and others were initially given oversight for HIPAA. Today, however, a huge number of privacy and security concerns relate to technology. An opportunity thus exists to convene an even more robust and dynamic discussion about best practices, one that engages all relevant regulators and health systems alike.

Historically, privacy and security risk audits within the healthcare system have only examined random samples or used basic checklists to monitor HIPAA compliance. This approach simply audits the tip of the iceberg, leaving a vast number of records unexamined. Government agencies have not yet embraced new technologies, like artificial intelligence, to allow for proactive, risk-mitigating privacy and security solutions capable of fully comprehensive audits. A selective and narrow approach leaves such a wide swath of cybersecurity vulnerabilities unchecked that patient privacy and security remain at significant risk.
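The coverage gap between a checklist-style sample and a comprehensive audit can be made concrete with a small sketch. The log entries, field names, and audit rule below are hypothetical, chosen only to show why sampling can miss violations that a full scan cannot:

```python
import random

# Hypothetical access-log records (illustrative data only).
access_log = [
    {"user": "nurse_a",   "record": "pt_1", "authorized": True},
    {"user": "billing_b", "record": "pt_2", "authorized": True},
    {"user": "vendor_c",  "record": "pt_3", "authorized": False},
]

def sampled_audit(log, k, seed=0):
    """Checklist-style audit: inspect only a random sample of k records,
    so a violation outside the sample goes undetected."""
    rng = random.Random(seed)
    sample = rng.sample(log, min(k, len(log)))
    return [entry for entry in sample if not entry["authorized"]]

def comprehensive_audit(log):
    """Full audit: every record is checked, so no violation can be missed."""
    return [entry for entry in log if not entry["authorized"]]

# The full audit always surfaces the unauthorized access; a small sample
# may or may not, depending on which records happen to be drawn.
print(comprehensive_audit(access_log))
```

Automated, comprehensive scans of this kind are exactly what proactive tooling (including AI-assisted analysis) could provide at the scale of a real health system, where the log is millions of entries rather than three.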

v. Small- and medium-sized organizations

While small- and medium-sized organizations do not manage the same volume of patient data as their larger counterparts, they still provide vital services that require the collection of sensitive patient information. Further, the highly interconnected healthcare ecosystem means that a single disruption could cause ripple effects that destabilize the industry as a whole. With the introduction of just-in-time supply chain delivery models, for instance, most healthcare delivery organizations operate with “very limited inventories, diagnostic capabilities, or capacity in an emergency, making many healthcare providers sensitive to cascading consequences in the context of a system-level disruption.”12 It is not difficult to imagine a cyber attack impacting a few vulnerable small- and medium-sized organizations, sparking a cascading system-level disruption that is amplified by the limited quantity of inventoried supplies.

Small- and medium-sized healthcare organizations face tremendous difficulties in maintaining a healthy cybersecurity posture. For some small- and medium-sized organizations, investing money to improve cybersecurity capacity is simply perceived as a drain on the bottom line, so they take a chance that nothing bad will happen, often thinking that they are too small to draw negative attention. For other small- and medium-sized organizations, there is often a misperception that implementing security is only achievable through an expensive onboarding of in-house resources, rather than looking to external managed service options that are available to them.13

Furthermore, the regulatory environment is perceived by many to be overly burdensome.14 In particular, many point to existing regulations in the Anti-Kickback Statute and Stark Law15 that, while important for protecting against fraud and abuse, prohibit the pooling of valuable cybersecurity resources that could benefit small- and medium-sized providers.16

vi. System development life cycle practices

When a medical device vulnerability is discovered by a manufacturer or a cybersecurity researcher, healthcare organizations often take one of two actions: they either scramble to apply a patch or conclude (sometimes wrongly) that the vulnerability’s impact on their existing network is minimal. As more and more vulnerabilities are discovered in medical devices, one begins to wonder what security practices medical device manufacturers actually follow. If medical device manufacturers provide cybersecurity support for their devices, do these services include security testing17 from development through end of life? If manufacturers are in compliance with FDA regulations covering medical devices, but fail to implement robust security testing throughout the development life cycle of a product, what does that say about the strength of existing FDA regulations? These are some of the questions that come to mind when discussing the ever-increasing number of medical device vulnerabilities. The answers to these questions vary widely depending on the manufacturer, but studies indicate that weak security standards are fairly common.

According to the 2017 Ponemon survey of healthcare organizations and medical device manufacturers, 43 percent of medical device manufacturers either do not conduct security tests (35 percent) or are unsure whether end-to-end security practices take place (7 percent) pursuant to an established secure SDLC process during the development of devices.18 A secure SDLC process is an end-to-end security practice that better ensures device security because security remains a continuous concern throughout the entire development life cycle, from initial requirements to end-of-life. Even among manufacturers that do conduct security tests during device development, only 9 percent say they continue to test their medical devices at least annually, a failure to apply effective, end-to-end secure SDLC practices. Healthcare delivery organizations are similar, with 53 percent either not testing their devices (45 percent) or unsure whether testing occurs (8 percent). While testing is only one phase within the secure SDLC process, this report sheds light on the existing gaps in applying effective, end-to-end secure SDLC practices, as well as the lack of enforcement mechanisms available to government regulators.
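One way to picture an end-to-end secure SDLC is as a set of per-phase checkpoints a device must clear from requirements through post-market support. The phase names and required activities below are an illustrative sketch, not an FDA or Ponemon taxonomy; the point is that a gap in any phase breaks the end-to-end guarantee:

```python
# Hypothetical per-phase security checkpoints for a secure SDLC.
REQUIRED_ACTIVITIES = {
    "requirements":   {"threat modeling"},
    "implementation": {"static analysis", "code review"},
    "verification":   {"dynamic analysis", "penetration test"},
    "post-market":    {"annual security test", "vulnerability monitoring"},
}

def sdlc_gaps(performed):
    """Compare the activities a manufacturer actually performs against the
    checkpoints; any missing set is a gap in end-to-end coverage."""
    return {
        phase: sorted(required - performed.get(phase, set()))
        for phase, required in REQUIRED_ACTIVITIES.items()
        if required - performed.get(phase, set())
    }

# Example: a manufacturer that tests during development but never post-market,
# mirroring the survey finding that annual testing is rare.
performed = {
    "requirements":   {"threat modeling"},
    "implementation": {"static analysis", "code review"},
    "verification":   {"dynamic analysis", "penetration test"},
}
print(sdlc_gaps(performed))
```

The survey numbers above suggest most manufacturers would report gaps in several phases at once, which is precisely what "not end-to-end secure" means in practice.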

Regrettably, this is not surprising. The FDA is the lead government agency tasked with medical device cybersecurity and it has purposefully left cybersecurity regulations vague and largely up to the discretion of medical device manufacturers. According to the FDA, the reason for leaving regulations broad is “because the regulation must apply to so many different types of devices.”19 However, given the high level of medical device vulnerabilities and the admission of poor alignment with established secure SDLC practices, it is clear that the FDA’s one-size-fits-all approach has not been successful. It is easy to understand why when reading the existing FDA guidelines, which allow for incredibly loose interpretations of pre- and post-market cybersecurity considerations.20 FDA guidance and standards do not even require medical device manufacturers to conduct specific (and basic) security tests throughout a product’s development life cycle—including static and dynamic application analysis, code review, and penetration tests.21

The regulations only require manufacturers to “establish design inputs for their device related to cybersecurity, and establish a cybersecurity vulnerability and management approach as part of the software validation and risk analysis,” which leaves a large amount of discretion up to medical device manufacturers.22 Post-market, the FDA provides a series of guidelines for how manufacturers can “implement comprehensive cybersecurity risk management programs and documentation consistent with the Quality System Regulation (21 CFR part 820).”23 However, these guidelines are also broad, allowing manufacturers to conduct sporadic and self-defined post-market testing that could easily fit within the broad regulatory framework.

It is no wonder that medical device vulnerabilities are so prolific given that the existing regulations do not explicitly mandate even the most basic cybersecurity testing.24 Indeed, this is the state of the software and hardware industry writ large, but it should not be the status quo for medical devices given the particular risks to individual wellbeing. Without effective, end-to-end secure SDLC practices incorporated throughout the life cycle of a medical device, design flaws are likely to occur. These flaws could allow unauthorized access, introduce medical risks to patients, and jeopardize the integrity and security of the data generated by a medical device. The reality is that, even with good testing practices, vulnerabilities would still appear at other points in a device’s life cycle. Ensuring end-to-end secure SDLC practices would go a long way toward reducing risks across the entire life cycle of a device.25

Given the myriad challenges around healthcare technologies, including the proliferation of legacy technologies, poor security infrastructures and incident response plans, limited budgets and tight margins, vague privacy and security standards under HIPAA, particular concerns affecting small and medium providers, and weak end-to-end secure SDLC practices for medical devices, it is important for policymakers to take action. The next section details a set of policy recommendations that would empower policymakers to do just that.

III. Healthcare Technology Policy Recommendations

The recommendations contained in this chapter focus on identifying technological opportunities and challenges facing the healthcare sector to improve overarching cybersecurity infrastructures and procedures. Together, these recommendations present a vision for the future healthcare cybersecurity tech landscape where legacy medical devices are a thing of the past, new sector-specific cybersecurity technologies are available to all healthcare organizations regardless of size, and regulations and guidance are clear and support comprehensive privacy and security standards that keep patients and data safe.

At the simplest level, these recommendations aim to square the incredible benefits of emerging technologies with the attendant cybersecurity risks they introduce. This means that, in five to ten years, as patients benefit from advances in artificial intelligence that predict their individual likelihood of getting sick, they will also be able to trust that their personal information is protected by privacy and security officers equipped with innovative sector-specific tools. Patients will be able to analyze data from their wirelessly connected and implanted medical devices, knowing that those devices were designed with the most robust security testing available and that continuous updates address new vulnerabilities. And healthcare providers will be able to ensure the trustworthiness of their networks so that, even in the face of an internal or external malicious attacker, they can continue delivering immediate, uninterrupted, quality care.

The policies proposed in this chapter are directed towards government agencies including the FDA, the HHS OCR, NIST and its National Cybersecurity Center of Excellence (NCCoE), and DHS, congressional leadership, industry accreditation groups like The Joint Commission (TJC), as well as medical device manufacturers and medical leaders who have influence over institutional policies within the healthcare sector.

It is additionally important to note that the ONC’s proposed Trusted Exchange Framework and Common Agreement put these needs in particular focus. These programs offer potential “levers” for setting a higher standard for information exchange, as well as risks if the framework is insufficiently demanding.

Recommendation #4.1: Create a government-backed program to encourage the phasing out of legacy technologies and phasing in of secure and interoperable technologies.

This recommendation builds on the 2017 Task Force Report observation that legacy systems must be secured.26 To have maximum impact, we recommend focusing on the following more specific recommendations.27

With guidance from health care accreditation organizations like TJC, and input from government agencies (e.g., HHS, ONC, and FDA), Congress should draft an incentive program that seeks to phase out legacy systems, potentially through Medicare and Medicaid reimbursements. The Medicare and Medicaid programs already offer reimbursements and special incentives directly to hospitals for healthcare expenditures. Currently, these programs award a finite amount of money according to a variety of procedural and quality-based outcomes. It is equally important that cybersecurity and privacy outcomes be included in these quality measures, since they greatly affect patient safety, dignity, and trust.

Congressional leaders should evaluate incentive options within the Medicare and Medicaid programs to encourage organizations to migrate security services to more trusted, state-of-the-art systems. Since the FDA is responsible for ensuring proper medical device security, it should take the lead in phasing out legacy medical devices. ONC is equipped with the authority to write regulations around minimum security functionality for EHRs, so it should take the lead in phasing out legacy EHR components. One expert in the field recommends concurrently implementing a faster cycle time on security standards, given the length of time it takes to make a rule. It is important that these efforts be harmonized across agencies, since the work of one will greatly impact the other. This push would tie in nicely with the interoperability effort being led by CMS and would be a great opportunity to incentivize privacy and security as attendant concerns related to interoperability.

Grants, vouchers, and/or tax incentives could be provided to partially offset the costs associated with this transition, with ongoing payments tied to performance and disbursed through a reimbursement mechanism. Any program should be flexible, providing incentives that are tailored to the size and unique needs of each health organization. One way to do this would be to create mechanisms within the Medicaid system, such as demonstration or innovation waivers, that would allow states to experiment with individualized incentive systems. Furthermore, the eligibility of new technologies should be held to a minimum standard of medical device cybersecurity, similar to the provisions outlined in the Internet of Things Cybersecurity Improvement Act of 2017.28 As a result, the incentives disbursed through the Medicare and Medicaid programs would encourage healthcare providers to purchase new equipment and phase in more interoperable and secure technologies.

Existing programs like the Modernizing Government Technology Act, which created a $500 million fund for updating legacy systems across federal agencies, could also be applied to federal health systems or even expanded to include state and local health organizations.29 While this program is structured to loan money for capital projects, it could certainly be used to provide resources for health systems that are paid back over time. Another way to hasten this transition is for TJC and the FDA to update their accreditation and regulatory processes to include a cybersecurity interoperability requirement for new technologies.

We acknowledge that the provision of public money to private entities is challenging. But in this particular sector—due to the direct patient safety issues that arise from poor cybersecurity, pervasive budgetary challenges and low margins, the need to (rightfully) prioritize clinical care over investments in cybersecurity—it makes sense to provide these incentives.

Mistakes were previously made when medical devices and EHRs were released into clinics with poor security infrastructures baked in.30 In order to avoid repeating a similar mistake when replacing legacy systems, a more robust set of Meaningful Use (now a part of MIPS, under MACRA) requirements articulated in the Certified Health IT Products List (CHPL) is essential. While there was, historically, a challenge with measuring privacy and security efficacy beyond specific EHR requirements in Meaningful Use, we believe that leveraging a risk-based framework to do so is both feasible and essential. There are many stakeholders involved here—health systems, EHR vendors, security vendors, government actors, patients, and many more—but a convening and standard-setting effort is necessary. Implementing these solutions would replace insecure legacy systems with new systems that are both secure and interoperable.

Recommendation #4.2: Learn from the financial sector’s success in sector-specific cybersecurity investment, spearheaded by NCCoE.

Cybersecurity challenges across industries are more similar than they are different, but the healthcare sector has many of its own idiosyncrasies. One-size-fits-all solutions often do not work for healthcare because they fail to address the intricacies of HIPAA, issues around sensitive PHI, the contrasting requirements of easy access and strong protection of EHRs, and other factors. For instance, consumer-grade multi-factor authentication (MFA) solutions could catastrophically fail in a Code Blue situation. Caregivers would have no time to input a code from an app or text message to unlock their computer if a patient were literally dying in front of them. Instead, MFA on a clinical workstation must be able to unlock in a fraction of a second, which products currently on the market achieve using biometric scanning or security tokens.
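A sketch of why token-based factors fit the clinical time budget: a local challenge-response between a badge or token and the workstation completes in well under a second, with no code for the caregiver to type. The key handling below is deliberately simplified for illustration; a real deployment would use keys provisioned in tamper-resistant hardware rather than generated in software:

```python
import hashlib
import hmac
import os
import time

# Simplified demo: in practice this key lives in the token's secure element
# and in the workstation's credential store, never in application memory.
SHARED_KEY = os.urandom(32)

def token_response(challenge, key=SHARED_KEY):
    """What the badge/token computes when the workstation challenges it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def workstation_verify(challenge, response, key=SHARED_KEY):
    """Constant-time check of the token's response on the workstation."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
start = time.perf_counter()
ok = workstation_verify(challenge, token_response(challenge))
elapsed = time.perf_counter() - start
print(ok, f"verified in {elapsed * 1e6:.0f} microseconds")
```

The entire exchange is a hash computation, so the second factor adds effectively no latency to unlock, unlike a texted or app-generated code that the caregiver must read and type.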

In order to develop cybersecurity solutions tailored to the health sector, the major players should follow the relatively successful model of the financial services sector. Large banks with enough liquidity created internal incubators or accelerators through which they purchase promising cybersecurity companies and steer them towards developing tools tailored to the idiosyncrasies of the financial sector.31 The benefits of this approach for the finance sector are twofold. First, and obviously, it catalyzes an industry dedicated to developing tools these companies can use in their core businesses. Second, incubating and scaling these promising companies is itself a legitimate investment opportunity. If incubated or accelerated companies succeed and gain traction across the financial sector, the bank that invested in them stands to profit. Though margins in the healthcare sector are, in aggregate, much smaller than those in finance, a number of larger healthcare providers have sufficient profit margins to emulate the financial sector model. A few may even be large enough to support this kind of program on their own through internal incubators, like the one Goldman Sachs recently launched.32

NIST’s NCCoE can serve as a coordinating body to gather the major players of the healthcare industry necessary to form this group of potential investors. The role of government, in this case, should not be to fund or develop the technologies, but to provide information and models for the healthcare sector to take on the task themselves. Sector leaders may benefit from some guidance in conceptualizing the cybersecurity needs of the healthcare sector, and to this end we put forth a three-tier framework:

  • Universal platform technologies. Tools that assure basic cyber hygiene, including but not limited to coverage from endpoints to data centers, intrusion response and prevention capabilities, and hybrid deployment options.33 These are necessary in any industry and are offered by a number of commercial platforms.
  • Industry-specific technologies. Tools that map onto the unique needs of the health sector, such as EHR privacy monitoring, PHI de-identification, and patient portal security.
  • Subsector-specific technologies. Tools that are only needed for healthcare providers of certain specialties or sizes. Examples include complex PHI de-identification tools for research institutions, managed service providers for smaller enterprises, and automated IoT monitoring for providers with significant hardware needs.

NCCoE may want to encourage this group of industry leaders to focus on the second and third tiers, since those have the fewest existing industry solutions. NCCoE can further help the industry by publishing guidance on best practices surrounding existing healthcare technology more frequently. NCCoE has already done commendable work in this regard on the second tier—in its guidance on EHRs and picture archiving and communication systems—and on the third tier—in its analysis of infusion pumps and pacemakers.34 This work should be scaled up and built upon to address a much wider array of challenges, and guidance priorities should be laid out in a three-year roadmap.

Recommendation #4.3: Leverage a broad array of existing funding programs to spur healthcare cybersecurity basic research and innovation.

In order to keep up with the evolving cyber threats facing the healthcare sector, there must be a paradigm shift in current healthcare cybersecurity research efforts. While a suite of recommendations encouraging additional research is not particularly novel, it bears repeating.35 Building on the work of the ONC, NIH labs, and the NCCoE, the government should lead an overarching initiative that forecasts where healthcare cybersecurity may be headed over the next five years and identifies the research strategies that could shape that future. The first stage of this process should resemble a net assessment: a compilation of active research and development (R&D) efforts aimed at identifying emerging or future threats and opportunities. R&D efforts within this vision could include securing the cyber-physical vulnerabilities present in connected medical devices and autonomous systems, and using emerging technologies like artificial intelligence, big data analytics, quantum cryptography, and blockchain to protect patient data.

Existing funding programs like the Small Business Innovation Research (SBIR) program and the NIH's "R01" standard independent research project grant are important for job creation and innovation. SBIR is already one of the largest government-industry partnerships by annual budget, but more needs to be done to focus research efforts on critical-need areas like healthcare cybersecurity.

There is growing empirical evidence that government-sponsored research programs like SBIR and R01 are particularly effective at catalyzing innovative projects that would not otherwise have been completed.36 Relatively small government investments in high-tech industries can move an industry up the learning curve and down the cost curve, creating lasting advantages in key sectors like healthcare.37 A supportive policy framework is needed for entrepreneurs and growing firms to bring welfare-enhancing technologies to the healthcare sector. As such, SBIR funding models at the federal and state levels and R01 grants should be expanded to support research specifically in healthcare cybersecurity. Policy experiments in other government agencies may offer guidance on how to develop these funding models.

Policy experiments like the Department of Defense’s Fast Tracking SBIR funding model have proven to be particularly effective at aligning departmental goals with an economy in which rapid innovation is rewarded. Fast Track funding increased the effectiveness of SBIRs by encouraging commercialization of specific products and technologies that also met the program’s objectives.38 Increasing SBIR funding for healthcare-specific research that has the potential for commercialization would encourage the rapid innovation and scalability needed to help mitigate the threats facing healthcare.

Beyond monetary investment from the government, SBIRs catalyze further investment from the private sector. SBIRs play a certifying role that can signal to private sector investors that an organization is trusted and worthy of investment. Private investors know that SBIR-funded enterprises have to go through a rigorous application and assessment process. Trust in this process mobilizes further private sector investment in an SBIR company’s technology and future commercialization. Without the credibility provided by an SBIR, private investors would be less likely to invest.

Even if an SBIR-funded healthcare cybersecurity business fails or its employees move on, there are still gains to be made. The human capital developed through an SBIR-funded research project sticks.39 Since this expertise can be applied at other companies, it has lasting economic value, especially for the chronically understaffed healthcare cybersecurity workforce. Moreover, SBIR-funded projects generate substantial spillover effects.40 The net benefits to society from SBIR-funded projects are therefore much greater than those of projects that do not receive SBIR funding: an 84 percent social rate of return for SBIR-funded projects versus a 25 percent expected rate of return for non-SBIR-funded projects.41

There are numerous examples where the federal government used tax dollars to invest in R&D projects aimed at meeting specific grand challenges, many of which are comparable to the healthcare cybersecurity vulnerabilities that currently face the nation. For example, the not-for-profit consortium SEMATECH (Semiconductor Manufacturing Technology) was established through an investment from the federal government to address unprecedented challenges in the semiconductor industry.42 NIST’s Advanced Technology Program was created to invest in research projects “that industry on its own could not fully support because of the technical risks involved, and often where timing is critical to eventual economic success in the highly competitive global market.”43 The Partnership for the Next Generation of Vehicles, or “Supercar” initiative, was a partnership between the U.S. government and three automobile manufacturers that sought to create a clean, safe, and affordable car with maximum fuel efficiency.44

To best follow these examples, a concentrated R&D effort in health system cybersecurity would need a specific, well-defined problem to address. The exact problem would be determined by the initiative's leaders, but three areas of cybersecurity research should be considered:

  • Proactive insider threat detection systems. One good way to mitigate insider threats is to use AI to flag suspicious or anomalous accesses to patient data. Health systems should ideally have tools that automatically review and document 100 percent of accesses.45
  • IoT medical device security platforms. As internet-connected medical devices become more prevalent, there must be platforms developed to assure that they are not only safe from outside attackers but also resilient to component failures, natural disasters, and even collapses in critical infrastructure.
  • AI technology to augment the privacy and security workforce. Large gaps in the health cybersecurity workforce are in themselves existential threats to the security of health systems. AI tools can be used to mitigate this gap by training models to perform the most rote and mundane aspects of cybersecurity, such as data audits.
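The first research area above can be sketched with a simple statistical baseline. The employee IDs, access volumes, and z-score threshold below are invented for illustration; production systems would use far richer features (time of day, patient-employee relationship, department) and learned models rather than a single volume statistic.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical access log: one entry per (employee, patient record) access.
# Nine staff with typical workloads, plus one bulk-access outlier.
access_log = []
for i, user in enumerate(["u1", "u2", "u3", "u4", "u5", "u6", "u7", "u8", "u9"]):
    n = 7 + i % 3                                   # typical: 7-9 records
    access_log += [(user, f"pt_{j}") for j in range(n)]
access_log += [("clerk_x", f"pt_{j}") for j in range(60)]  # bulk snooping

def flag_anomalous_users(log, z_threshold=2.0):
    """Flag users whose access volume is a statistical outlier among
    peers -- a crude stand-in for the learned models described above."""
    per_user = Counter(user for user, _ in log)
    mu, sigma = mean(per_user.values()), stdev(per_user.values())
    return sorted(u for u, n in per_user.items() if (n - mu) / sigma > z_threshold)

print(flag_anomalous_users(access_log))  # → ['clerk_x']
```

Even this crude baseline shows why 100 percent review is feasible: scoring every access is cheap, and human auditors need only examine what the model flags.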

Recommendation #4.4: Create mechanisms for clarifying privacy standards, providing advice, and receiving feedback from health systems.

In a recent audit of 166 covered entities, OCR found that only one entity was compliant in its risk management process and none were compliant in their risk analysis processes.46 It is unlikely that this abysmal compliance rate is solely the result of gross negligence by health systems. Healthcare providers may struggle to understand how to comply, especially with vaguely defined concepts (such as "security risk"), as providers themselves have suggested. At present, the main mechanisms through which OCR gives guidance are fines and naming and shaming. These punitive measures may not constitute actionable precedent for providers on the security measures they are expected to take. Instead, OCR should open up more channels through which it can provide insight to privacy officers.

One way to do this is through more prescriptive standards, but this may compromise OCR's arm's-length position as a regulatory body. A better alternative is to use OCR's power as a convening body to gather a group of experts and stakeholders and have them develop definitions and guidance for complying with HIPAA's privacy and security rules. OCR has options for creating more of a two-way conversation about its guidance: it could convene a regular meeting of privacy experts, CISOs, developers, healthcare professionals, and payers; it could hold Q&A conference calls or maintain a helpline for covered entities; or it could maintain more active listservs. OCR need not provide a safe harbor for health systems, only a means of consultation.

Offering clear guidance does not preclude using punitive measures. OCR can take guidance from ONC, which has used its extensive regulatory authority over the certification of EHRs to improve their cybersecurity. The American Medical Informatics Association (AMIA) has asked ONC to go even further in using this authority to promote security and interoperability measures that would allow for continuous monitoring of the security of EHRs (surveilling the systems, not the patients).47 OCR should not shy away from wielding its own authority to ensure compliance with privacy-related HIPAA rules.

These options are not mutually exclusive, although the more hands-on approaches may require hiring additional technical experts. If OCR uses these levers to clarify privacy standards, it could minimize healthcare provider violations, allow for shared auditing criteria across the health sector, and help CISOs and other privacy and security officers shift time away from mundane auditing tasks and toward meaningful, proactive privacy and security initiatives. An instructive precedent already exists in IRS letters of determination, which provide clarity and comfort to organizations that are innovative and thoughtful but may still have honest questions.

Recommendation #4.5: Strengthen FDA requirements around medical device security to ensure that security is baked-in at every point in the device’s life cycle.

To help address existing gaps in medical device guidance, the FDA should add an additional requirement that ensures device makers use an end-to-end secure system development lifecycle (SDLC). The sheer magnitude of medical device vulnerabilities, coupled with the worrisome reality that most medical device manufacturers and healthcare organizations fail to adequately test medical devices throughout their development lifecycle, underscores the importance of a renewed regulatory framework. Proper security testing must be ensured throughout the development lifecycle of a medical device. The FDA is well-placed to meet this challenge, with security policy support from agency partners at DHS.

Broadly speaking, the additional requirement would be geared towards implementing secure coding practices and identifying vulnerabilities within medical devices. More specifically, this would include cybersecurity testing best practices such as dynamic and static application analysis after each code change, and penetration testing, among other common cybersecurity measures. In the past, medical device manufacturers have operated with wide discretion over their cybersecurity assessments. According to the FDA, the reason for this loose regulatory standard is “because the regulation must apply to so many different types of devices.”48 The FDA would be wise to provide transitional support through trainings, public outreach, and site visits to help steer manufacturers towards a stronger standard for medical device cybersecurity.

Defining and regulating a secure SDLC process also prevents device manufacturers from “passing the buck” of cybersecurity to health system customers. As previously mentioned, cybersecurity works best when it is baked-in by manufacturers from the beginning and continually reevaluated throughout the development cycle. Health system customers can only wrap security around what has already been built, a method not only far less effective but also one that healthcare systems may not have the technical or workforce capacity to implement.

For example, imagine that an open source software package used in an EHR-connected patient monitoring device is found to have a vulnerability. If the manufacturer designed the device using a secure SDLC, they would be able to update the device remotely and communicate the update to customers. If not, they may have no way to provide a security patch, and the customer, even if they somehow found out about the vulnerability, could at best silo the compromised device from the rest of their network and reconfigure it to work without connecting to an EHR. Even this fix may not be sufficient, and it would certainly not scale to other organizations.

The FDA has already taken some steps toward regulating secure SDLCs. Recent draft guidance includes a requirement that medical devices provide a "cybersecurity bill of materials": a list of hardware and software components that could potentially become vulnerable. The guidance also differentiates between devices with high security risk, like implanted devices or pacemakers, and standard security risk, like an EHR.49 These measures are good first steps, but in their current form they only affect devices that require pre-market approval, a class of devices that has been rapidly shrinking over the past few years.50
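The bill-of-materials concept reduces, at its core, to a lookup: match a device's component inventory against a feed of known-vulnerable versions. All component names, version numbers, and the CVE identifier below are invented for this sketch; a real implementation would consume standardized SBOM formats and live vulnerability feeds such as the NVD.

```python
# Hypothetical component inventory for one device model, and an invented
# stand-in for a known-vulnerability feed keyed by (component, version).
device_cbom = {
    "infusion-pump-os": "4.2.1",
    "tls-library": "1.0.2",
    "hl7-parser": "2.6.0",
}

vulnerable_versions = {
    ("tls-library", "1.0.2"): "CVE-0000-0001 (invented)",
}

def audit_cbom(cbom, vuln_feed):
    """Return the components in a bill of materials that match a
    known-vulnerable (name, version) pair, with the associated advisory."""
    return {
        name: vuln_feed[(name, version)]
        for name, version in cbom.items()
        if (name, version) in vuln_feed
    }

print(audit_cbom(device_cbom, vulnerable_versions))
# → {'tls-library': 'CVE-0000-0001 (invented)'}
```

The value of mandating the bill of materials is precisely that this lookup becomes possible for customers and regulators, not just the manufacturer: the patient-monitoring scenario above is only detectable if someone knows the vulnerable package is inside the device.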

By issuing additional, more explicit guidance on required cybersecurity measures for all medical devices, the FDA can ensure device makers conduct effective security testing throughout the development lifecycle of a medical device. In turn, devices will be more secure when they hit the market and will remain secure after they have been adopted by clinics.

Citations
  1. Protenus, Breach Barometer; Verizon, 2018 DBIR; Ponemon, Breach Study.
  2. This idea was floated as a “Cash for Clunkers” style program in the Cybersecurity Task Force, Report, 29. We would like to credit the idea of phasing out legacy technologies to the Task Force, but also provide an alternative pathway to achieving a meaningful transition. We also recommend using different language to describe this program since “Cash for Clunkers” was created during a specific political context that may be negatively perceived by some constituencies.
  3. I Am the Cavalry, Hippocratic Oath for Connected Medical Devices, January 19, 2016.
  4. Cybersecurity Task Force, Report, 1.
  5. Ibid., 28.
  6. The WannaCry ransomware attacks exploited vulnerabilities on devices running outdated or unpatched versions of an old Windows operating system, encrypting the system and demanding that the users of infected systems pay a ransom to regain control of their devices. Because a large number of medical devices ran this legacy version of Windows, a large number of these devices were affected, shutting down health systems around the world. According to a post-incident investigation conducted by England’s National Health Service (NHS), thousands of appointments were cancelled and medical care had to be triaged just to maintain emergency care services. Even still, five acute trusts, or hospital trusts that provide secondary care in the United Kingdom—including in London—were forced to divert emergency patients to other departments. Without the fortuitous intervention of a cybersecurity researcher who stumbled upon a “kill switch” that effectively stopped the spread of the bug, even more disruption would have occurred. Yet, according to the NHS report, “all organisations infected by WannaCry shared the same vulnerability and could have taken relatively simple action to protect themselves,” such as updating or patching known software flaws in their legacy devices that had been previously flagged by the national healthcare IT partner, NHS Digital.
  7. According to the Verizon 2015 DBIR, 99.9 percent of “exploited vulnerabilities had been compromised more than a year after the associated CVE [Common Vulnerabilities and Exposures report] was published.” Verizon, Verizon 2015 Data Breach Investigations Report (New York, NY: Verizon, Inc., 2015). (Hereafter: Verizon, 2015 DBIR).
  8. Ponemon Institute, Medical Device Security: An Industry Under Attack and Unprepared to Defend (Traverse City, MI: Ponemon Institute, 2017). (Hereafter: Ponemon, Medical Device Security).
  9. Ibid.
  10. Ibid.
  11. Symantec, Addressing Healthcare Cybersecurity Strategically (Mountain View, CA: Symantec Corporation); David B. Black, “Security Regulations vs. Cyber-security: The War,” Huffington Post, May 2, 2017; Amit Kulkarni, “Why HIPAA Compliance Does Not Equal Data Security,” Health IT Outcomes, August 9, 2016.
  12. U.S. Department of Homeland Security, Healthcare and Public Health Sector-Specific Plan, p.9, May 2016.
  13. Thanks to Matt Doan for this insight.
  14. Cybersecurity Task Force, Report.
  15. Office of the Inspector General U.S. Department of Health & Human Services, A Roadmap for New Physicians: Fraud & Abuse Laws, July 17, 2018.
  16. This observation and an initial set of recommendations was first brought to our attention by the excellent work of the June 2017 Health Care Cyber Security Task Force Report, which we will continue to build upon. See Cybersecurity Task Force, Report, 27.
  17. For example, static and dynamic application analysis, code review, and penetration testing.
  18. Ponemon, Medical Device Security.
  19. U.S. Food and Drug Administration, Quality System (QS) Regulation/Medical Device Good Manufacturing Practices, March 27, 2018. source. (Hereafter: FDA Medical Device Regulation)
  20. More specifically, the FDA’s cybersecurity guidance for premarket medical device development can be found in: U.S. Food and Drug Administration, Content of Premarket Submissions for Management of Cybersecurity in Medical Devices (Silver Spring, MD: Food and Drug Administration, 2014); and U.S. Food and Drug Administration, Postmarket Management of Cybersecurity in Medical Devices: Guidance for Food and Drug Administration Staff (Silver Spring, MD: Food and Drug Administration, 2016) (henceforth: U.S. Food and Drug Administration, “Postmarket Management.”).
  21. Elizabeth Snell, “How FDA Medical Device Cybersecurity Guidance Affects Providers,” HealthIT Security. source
  22. U.S. Food and Drug Administration, “Postmarket Management,” 13.
  23. Ibid.
  24. Cybersecurity experts will readily note that lack of testing is but one of many factors contributing to the proliferation of vulnerabilities. This example is given to show that regulations lag far behind even some of the most basic cybersecurity practices.
  25. For more, see Cybersecurity Task Force, Report, 30-31.
  26. Cybersecurity Task Force, Report, 28-29. The Report focuses primarily on actions that industry can take to phase out legacy technology, and also correctly calls for government action to “consider incentives, requirements, and/or guidelines for reporting and/or use of unsupported [legacy] system[s].” The Report also calls for “incentive recommendations to phase-out legacy and insecure health care technologies.” Our report goes one step further in providing an even more specific plan for how to achieve this phasing-out.
  27. Thanks to Josh Corman and Beau Woods from I Am The Cavalry for their tremendous help with these recommendations.
  28. U.S. Congress, House, Internet of Things (IoT) Cybersecurity Improvement Act of 2017, S.1691, 115th Cong., 1st sess., introduced in Senate August 1, 2017. source
  29. Ali Breland, “White House Unveils Report on Modernizing IT,” The Hill, December 13, 2017. source
  30. See, generally, Chapter 1 of this report in the section on “Vulnerabilities and Consequences.”
  31. For examples of such incubators, see: Neil Ainger, Barclays sign eight FinTech start-ups and spinoff ‘intrapreneur’ CNBC.com (CNBC website, May 4 2017, updated May 5 2017) source and JP Morgan Chase In Residence, (JPMC website, accessed Aug 25 2019) source
  32. For example of such an incubator, see: Tamaya Macheel, Goldman Sachs launches in-house incubator, Tearsheet, March 15, 2018 source
  33. Jon Oltsik, “What is a Cybersecurity Technology Platform Anyway?” CSO Online, April 27 2018, source
  34. National Cybersecurity Center of Excellence, Securing Electronic Health Records on Mobile Devices (Gaithersburg, MD: National Institute of Standards and Technology, 2015); National Cybersecurity Center of Excellence, Securing Picture Archiving and Communication System (PACS): Cybersecurity for the Healthcare Sector (Gaithersburg, MD: National Institute of Standards and Technology, 2017); National Cybersecurity Center of Excellence, Securing Wireless Infusion Pumps In Healthcare Delivery Organizations (Gaithersburg, MD: National Institute of Standards and Technology, 2017).
  35. Some useful information on existing research efforts can be found on the U.S. National Library of Medicine website (source).
  36. Charles W. Wessner, “Recommendations and Findings,” in The Small Business Innovation Research Program (SBIR): An Assessment of the Department of Defense Fast Track Initiative (Washington, DC: National Academy Press, 2000), 32.
  37. Ibid., “Preface”, 5.
  38. Ibid., “Introduction”, 27.
  39. Ibid., “Recommendations and Findings,” 33.
  40. Ibid., 35.
  41. For a more detailed explanation on how to calculate social rates of return, see the study by Link and Scott “Estimates of the Social Returns to Small Business Innovation Research Projects,” in Ibid., 275.
  42. The Department of Defense, through its “Defense Advanced Research Projects Agency” (DARPA), began funding SEMATECH in 1987. See more: DARPA, SEMATECH, July 17, 2018 source
  43. NIST, NIST Advanced Technology Program Launches 54 New Technology R&D Projects, October 4, 2000.
  44. National Research Council, Review of the Research program of the Partnership for a New Generation of Vehicles: Fourth Report (Washington, DC: The National Academies Press, 1998).
  45. In the interest of full disclosure, please note that this specific area, also known as "Healthcare Compliance Analytics," is the core business of Robert Lord, one of the co-authors of this paper.
  46. These were the preliminary results from Phase 2 of the HIPAA Audit Program. Full report available at: Linda Sanches, “Update on Audits of Entity Compliance with the HIPAA Rules” (Washington DC, Office for Civil Rights (OCR), U.S. Department of Health and Human Services, September, 2017) source
  47. See the AMIA’s response to the ONC’s RFI concerning EHR reporting. Douglas B. Fridsma, President and CEO of the American Medical Informatics Association, Letter to The Honorable Donald Rucker, MD, National Coordinator for Health Information Technology, US Department of Health and Human Services (‘Re: Request for Information Regarding the 21st Century Cures Act Electronic Health Record Reporting Program’) dated October 17, 2018 (AMIA website, October 2018) source
  48. FDA Medical Device Regulation.
  49. FDA, “Content of Premarket Submissions for Management of Cybersecurity in Medical Devices”, October 18, 2018. www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM623529
  50. Thanks to David Holtzman for helping articulate this point.
