With all the talk about the need for effective security measures to protect personal health data stored in electronic health records and shared among organizations participating in health information exchanges, the decision about what actual security and privacy controls an organization puts in place remains highly subjective, and therefore likely to vary greatly among health care entities. This is neither a new nor a particularly surprising problem in health information security, given the structure of the laws and regulations that set requirements for security and privacy provisions, but in some ways the lack of more robust security requirements (and the complete absence of privacy requirements) in the administration's final rules on EHR incentives under "meaningful use" represents a lost opportunity. The security-related meaningful use measures and associated standards and certification criteria for EHR systems provide another instance of federal rules promulgated under the authority of the Health Information Technology for Economic and Clinical Health (HITECH) Act that, as implemented, fall somewhat short of the vision articulated in the law.
Where security and privacy laws are concerned, Congress has always shown reluctance to mandate specific security measures or technologies, in part to avoid favoring any particular technology, market sector, or vendor, and also because the authors of such legislation correctly assume that they may lack the technical expertise necessary to identify the most appropriate solutions, and instead choose to delegate that task to NIST or other authorities. The net result, however, is sets of "recommended" or "addressable" security safeguards or, in the case of explicitly required security controls, endorsement of a risk-based approach to implementing security that allows organizations to choose not to put some controls in place, with appropriate justifications for those decisions. There's nothing inherently wrong with this approach — it embodies fundamental economic principles about security, particularly the idea that it doesn't make sense to allocate more resources to securing information and systems than those assets are worth. The problem lies in the reality that different health care organizations will value their information assets in different ways, will face different threats and corresponding risks to those assets, and will have different tolerances for risk that drive what is "acceptable" and what isn't, and that similarly drive decisions about which security measures to implement and which to leave out.
From a practical standpoint, what might be helpful to build confidence in the security of health IT such as EHR systems would be a set of minimum security standards that all organizations would need to implement. The HIPAA Security Rule includes a large number of administrative, physical, and technical safeguards (45 CFR §§164.308, 164.310, and 164.312, respectively), but many of the "required" safeguards are described in sufficiently vague terms that compliance is possible with widely varying levels of actual security, and many of the most obviously helpful safeguards, like encryption, are "addressable" and therefore not required at all. There were relatively few security standards and criteria included for meaningful use stage 1, and most of the items that were included already appear somewhere in the HIPAA Security Rule, but what stands out about the standards and criteria is how little specificity they contain. The minor revisions to these security items in the final rules issued late last month should make it fairly easy for organizations to satisfy the measures, but will have little impact in terms of making EHR systems, or the health care organizations that use them, more secure. The only identifiable "standards" included are the government's Federal Information Processing Standards (FIPS) for encryption strength (FIPS 140-2) and for secure hashing (FIPS 180-3), while everything else is described in functional terms that leave the details to the vendor providing the EHR system or the entity doing the implementation. Even the risk analysis requirement (the only explicit security measure in meaningful use) was reduced in scope between the interim and final versions of the rules: under meaningful use, the required risk analysis only needs to address the certified EHR technology the organization implements, not the organization overall.
This is markedly less than what is already required of HIPAA-covered entities (and, under HITECH, of business associates as well) under the risk analysis provision of the HIPAA Security Rule.
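To make the FIPS references above concrete: FIPS 180-3 specifies the SHA family of secure hash algorithms, which are available in standard libraries. The sketch below is purely illustrative (the sample record and field names are hypothetical, not drawn from any EHR standard); it shows how a SHA-256 digest, one of the FIPS 180-3 approved algorithms, could serve as an integrity check on exported health data:

```python
import hashlib

# Hypothetical serialized EHR record (illustrative only).
record = b'{"patient_id": "12345", "rx": "lisinopril 10mg"}'

# SHA-256 is one of the secure hash algorithms specified in FIPS 180-3.
# Storing this digest alongside the record lets a recipient detect
# whether the record was altered in transit or at rest.
digest = hashlib.sha256(record).hexdigest()

# A SHA-256 digest is 256 bits, rendered here as 64 hex characters.
print(len(digest))
```

Note that the functional criteria in the meaningful use rules stop at roughly this level of generality — they name the approved algorithm families but leave key management, protocol choices, and implementation details to the vendor or implementer.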