The decision by Department of Veterans Affairs CIO Roger Baker to allow users to connect mobile devices such as the Apple iPad and iPhone to the agency's computing network is a good example of the trade-off many organizations face among security, user desires, and practical business considerations. It also illustrates the subjectivity inherent in security management decisions and the authority delegated to federal agency executives to apply their own risk tolerance to those decisions. In a July Nextgov article, Baker acknowledged that the devices' security software is not FIPS certified, but said he is willing to accept the risk of allowing the devices anyway, on the assumption that even without FIPS certification the encryption technology is sufficient to provide the needed protection. While the VA prepares for broader mobile device support this fall, it is running a pilot program with Apple devices; Baker is participating in the pilot and, according to FederalTimes.com, has traded his own laptop for an iPad.
From a security standpoint, the VA's plan to allow agency-issued and personal mobile devices to access Departmental networks is most noteworthy because the devices in question do not yet satisfy federal standards for encryption. This is a particularly sensitive issue for the VA, which has a checkered history of data breaches, including the well-publicized 2006 theft of a VA laptop containing unencrypted records on some 26.5 million veterans. To be fair, Apple devices do offer encryption capabilities, but the software used to provide them is not certified compliant with Federal Information Processing Standard (FIPS) 140-2, and so fails to satisfy federal security requirements for cryptographic modules. Apple is currently validating its cryptographic modules for both the iPhone and iPad through the National Institute of Standards and Technology's Cryptographic Module Validation Program. According to NIST's "Modules in Process" list, both Apple modules are in the first phase of the process, called "implementation under test," meaning Apple has a testing contract in place with a cryptographic security and testing lab and has provided the module and all required documentation to the lab. While still an early stage of certification, this progress may give the VA and other agencies some confidence that FIPS certification is pending, making the risk of running uncertified encryption a temporary issue.
The fact that the VA can independently decide to essentially waive a federal technology standard reflects the authority that most federal agencies have under current law and policy. Most agencies are self-accrediting when it comes to determining the security measures needed to adequately protect enterprise information and other assets. Federal agencies are expected to apply risk-based decision making to security management, and since the authority rests with each agency, decision makers weigh the risk to the organization from using a given system or technology against the benefits offered and the cost of implementing security safeguards. Different organizations (and different decision makers within those organizations) have different appetites for risk, so what is acceptable to one agency may be unacceptable to another. In the VA's case, it seems likely that Baker is not demonstrating especially high risk tolerance, but rather that the perceived risk of using encryption that has not yet achieved FIPS certification is not high enough to preclude the use of mobile computing devices in health care delivery settings.
Among the potential downsides of using encryption without FIPS 140-2 certification is breach notification. The federal health data breach notification and disclosure requirements, which went into effect in September 2009 under the authority of an interim final rule, exempt organizations from having to disclose data breaches if the data is "unusable, unreadable, or otherwise indecipherable to unauthorized individuals," which HHS declared to mean that the data has been either encrypted or destroyed. Because the FIPS 140-2 requirements apply whenever government regulations call for the use of security based on cryptographic modules, the practical interpretation of HHS's breach disclosure exemption for encrypted data is that such encryption must use FIPS 140-2 certified cryptography. In theory, this means that if the VA loses one of its newly network-connected iPads with protected health information on it, it would have to report the breach even if the device had encryption enabled. Practically speaking, the VA already reports many types of data breaches to Congress and to the public, as required by the Veterans Benefits, Health Care, and Information Technology Act of 2006 (Pub. L. No. 109-461), so the new health data breach rules stemming from the HITECH Act are in many ways redundant to existing practice.
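The disclosure reasoning above reduces to a simple decision rule. The sketch below is purely illustrative: the function name, the three boolean inputs, and the reduction of the regulation to this logic are my own simplifications for exposition, not anything drawn from the rule's text or any VA system.

```python
# Toy model of the HHS breach-disclosure exemption as interpreted above:
# the exemption applies only when the data was encrypted with a
# FIPS 140-2 validated cryptographic module (destruction not modeled).

def disclosure_required(contains_phi: bool,
                        encrypted: bool,
                        fips_140_2_validated: bool) -> bool:
    """Return True if losing the device would trigger breach disclosure."""
    if not contains_phi:
        # No protected health information, so the health-breach rule
        # does not come into play at all.
        return False
    # Encryption alone is not enough under this interpretation; the
    # cryptographic module must also be FIPS 140-2 validated.
    return not (encrypted and fips_140_2_validated)

# A lost iPad holding PHI, encryption enabled, module not yet validated:
print(disclosure_required(True, True, False))  # True: must disclose
# The same device after the module completes CMVP validation:
print(disclosure_required(True, True, True))   # False: exemption applies
```

The example makes concrete why the pending CMVP validation matters: flipping the single `fips_140_2_validated` flag is, under this reading, what moves a lost device out of mandatory-disclosure territory.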