
Friday, August 6, 2010

Despite emphasis on risk analysis, health IT security won't change much under meaningful use

With all the talk about the need for effective security measures to protect personal health data stored in electronic health records and shared among organizations participating in health information exchanges, the decision about what actual security and privacy controls an organization puts in place remains highly subjective, and is therefore likely to vary greatly among health care entities. This is neither a new nor a particularly surprising problem in health information security, given the structure of the laws and regulations that set requirements for security and privacy provisions, but in some ways the lack of more robust security requirements (and the complete absence of privacy requirements) in the administration's final rules on EHR incentives under "meaningful use" represents a lost opportunity. The security-related meaningful use measures and associated standards and certification criteria for EHR systems provide another instance of federal rules promulgated under the authority of the Health Information Technology for Economic and Clinical Health (HITECH) Act that, as implemented, fall somewhat short of the vision articulated in the law.

Where security and privacy laws are concerned, Congress has always shown reluctance to mandate specific security measures or technologies, partly to avoid favoring any particular technology, market sector, or vendor, and partly because the authors of such legislation correctly assume that they may lack the technical expertise necessary to identify the most appropriate solutions, and so choose to delegate that task to NIST or other authorities. The net result, however, is sets of "recommended" or "addressable" security safeguards or, in the case of explicitly required security controls, endorsement of a risk-based approach to implementing security that allows organizations to choose not to put some controls in place, with appropriate justifications for those decisions. There is nothing inherently wrong with this approach: it embodies fundamental economic principles about security, particularly the idea that it makes no sense to allocate more resources to securing information and systems than those assets are worth. The problem lies in the reality that different health care organizations will value their information assets in different ways, will face different threats and corresponding risks to those assets, and will have different tolerances for risk that determine what is "acceptable" and what is not, and that similarly drive decisions about which security measures to implement and which to leave out.

From a practical standpoint, what might help build confidence in the security of health IT such as EHR systems would be a set of minimum security standards that all organizations would need to implement. The HIPAA Security Rule includes a large number of administrative, physical, and technical safeguards (45 CFR §§164.308, 164.310, and 164.312, respectively), but many of the "required" safeguards are described in sufficiently vague terms that compliance is possible with widely varying levels of actual security, and many of the most obviously helpful safeguards, like encryption, are "addressable" and therefore not required at all. Relatively few security standards and criteria were included for meaningful use stage 1, and most of the items that were included already appear somewhere in the HIPAA Security Rule, but what stands out about the standards and criteria is how little specificity they contain. The minor revisions to these security items in the final rules issued late last month should make it fairly easy for organizations to satisfy the measures, but will do little to make EHR systems, or the health care organizations that use them, more secure. The only identifiable "standards" included are the government's Federal Information Processing Standards (FIPS) for encryption strength (FIPS 140-2) and for secure hashing (FIPS 180-3); everything else is described in functional terms that leave the details to the vendor providing the EHR system or the entity doing the implementation. Even the risk analysis requirement (the only explicit security measure in meaningful use) was reduced in scope between the interim and final versions of the rules: under meaningful use the required risk analysis need only address the certified EHR technology the organization implements, not the organization overall.
This is markedly less than what is already required of HIPAA-covered entities (and, under HITECH, of business associates as well) under the risk analysis provision of the HIPAA Security Rule.
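For context on the two standards named above: FIPS 140-2 governs validation of cryptographic modules (i.e., implementations), while FIPS 180-3 specifies the SHA family of secure hash algorithms themselves. A minimal sketch of FIPS 180-3-style hashing using Python's standard library, with a hypothetical record payload, illustrates how little the "functional terms" in the criteria actually constrain:

```python
import hashlib

# FIPS 180-3 specifies the SHA family of secure hash algorithms;
# SHA-256 is one approved choice, available in Python's standard library.
# The record content below is entirely hypothetical, for illustration only.
record = b"patient-id:12345|encounter:2010-08-06|lab:HbA1c=6.8"

# The digest is a fixed-length fingerprint: 256 bits = 64 hex characters.
digest = hashlib.sha256(record).hexdigest()

# Any change to the record yields a completely different digest, which is
# what makes hashing useful for integrity checks on stored or exchanged data.
tampered = hashlib.sha256(record + b"x").hexdigest()
assert digest != tampered
```

Note that choosing a FIPS-approved algorithm is not the same as FIPS 140-2 compliance, which attaches to the validated cryptographic module: hashlib delegates to whatever crypto library the Python build links against, which may or may not be a validated module.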

2 comments:

  1. Steve,

    Well said. The hard part is defining the set of security functionality that is really needed. I was involved in the CCHIT criteria as co-chair at the time; we brought in a full set of security functionality, and it worked quite well for years. I have been frustrated that this was not continued, even if only by pointing to NIST rather than inventing criteria themselves.

    However, there is a security 'capability' that they did add that seems to be taking on a life of its own: encryption for data-at-rest. The regulation text is rather unclear on this requirement; the accompanying comments are clearer. But it is NIST's interpretation in the test steps that seems to have gone off the rails.

    What is your read of the requirement for an EHR to have the capability to encrypt data-at-rest? Is this a criterion applied to exported data-sets, portable devices, or the whole EHR?

  2. My interpretation of the language regarding general encryption and decryption capabilities is that the EHR would need the ability (whether or not the implementing organization uses it) to encrypt the data stored in the EHR, essentially within the database. For data transmitted, copied, backed up, or exported from the EHR, I believe the relevant standard is encryption when exchanging information. Since all the criteria and standards apply to certified EHR technology, the answer to your question would seem to be "all of the above."
