Encryption is key to medical data security

Even though the U.S. government wrote them, HIPAA's almost-final rules mandating the integrity, confidentiality, and availability of medical information are more complicated than they sound. For one thing, the rules describe the results of security measures rather than the systems required to support them. For another, computers and data networks are complex beasts that are notoriously easy to fool.

By all accounts, implementing the rules once they're finalized this fall won't be easy. Data security under the U.S. Health Insurance Portability and Accountability Act of 1996 (HIPAA) will require a vast network of interconnected and sometimes-overlapping security systems, with far-reaching implications for both healthcare providers and technology vendors.

Depending on how you interpret the rules, for example, the systems will probably need to protect all patient-identifiable information in transmission. They'll need to keep networks free of viruses and denial-of-service attacks. And they'll have to ensure that the right information is available when and where it's needed -- a crucial piece when patients' lives are at stake.

Translating the rules into data security will occur on at least seven different fronts, according to Steve Langer, Ph.D., of the University of Washington in Seattle, speaking at this year's Symposium for Computer Applications in Radiology in Philadelphia. These include:

  • Authentication -- Is the user who he or she claims to be? Authentication methods include passwords, public/private keys, biometrics, public key infrastructure or trusted certificates, etc.
  • Authorization -- Does the user have the right to the particular information?
  • Integrity -- Is the data intact and unchanged?
  • Confidentiality -- Has the information been protected from viewing by unintended parties?
  • Nonrepudiation -- Can a user deny having sent a message?
  • Reliability -- Is the system available at all times?
  • Auditing -- Are you able to track who accessed what? Facilities that have celebrities as patients, for example, need to be able to track who is accessing their medical records.

Encryption

Encoding text so that it can be read only after being decoded by the intended recipient is essential to at least the first four security goals, Langer said, and the last three probably couldn't exist without it either. The codes, or algorithms, can be based on public key methods, private key methods, or both. Authentication is another key part of the picture.

In private key methods such as DES (Data Encryption Standard), both the sender and the recipient have the same algorithm, or key. The sender encodes the message and sends it; the recipient receives it and decodes it with the same key. But there are drawbacks.
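
To make the shared-key idea concrete, here is a minimal sketch -- not Langer's example -- written in Python with the third-party "cryptography" package. Fernet uses AES internally rather than DES, but the workflow is the same: sender and recipient hold the identical secret key.

    # Shared-key (symmetric) encryption sketch. Fernet is AES-based, not DES,
    # but it illustrates the same model: one secret key shared by both parties.
    from cryptography.fernet import Fernet

    shared_key = Fernet.generate_key()   # must be exchanged securely beforehand
    cipher = Fernet(shared_key)

    # Sender encodes the message with the shared key...
    ciphertext = cipher.encrypt(b"MR report for patient 12345")

    # ...and the recipient decodes it with the very same key.
    plaintext = Fernet(shared_key).decrypt(ciphertext)
    assert plaintext == b"MR report for patient 12345"

The weakness Langer describes follows directly: every pair of correspondents needs its own securely exchanged copy of the key.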

"It's fine if you've met the person face-to-face, but it's doesn't scale well. A single secret between two people may last. Among three people, the chances decrease. Among 20 -- not a chance. So we have to look at alternatives," Langer said.

DES normally uses a 56-bit key to encrypt each 64-bit block of data. That key length has been cracked, however, and longer keys -- such as those used in triple DES -- are now in common use. As computers grow more powerful, even that level won't last forever, he said.

Public key systems avoid the need to keep secrets by using two kinds of keys: a private key that is never shared, and a public key that is given away freely. To use it, Person A encrypts a message to Person B with B's public key, and sends it to B. Only B can read the message with his private key; even A can no longer read it. Thus, public keys can be posted on Web sites, and downloaded by anyone who wants to send a secure message to the key owner.
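
A minimal Python sketch of that exchange, again using the "cryptography" package (the names and message are illustrative only):

    # Public-key encryption sketch: B publishes a public key; only B's
    # private key can decrypt what was encrypted with it.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Person B generates a key pair; the public half can be posted anywhere.
    b_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    b_public = b_private.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Person A encrypts with B's public key...
    ciphertext = b_public.encrypt(b"lab results for patient 12345", oaep)

    # ...and only B, holding the private key, can read it; even A cannot.
    plaintext = b_private.decrypt(ciphertext, oaep)
    assert plaintext == b"lab results for patient 12345"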

SSL, or Secure Sockets Layer, is a public key-based scheme. Developed by Netscape, it's available in 40- and 128-bit versions. It works well between browser and server, but doesn't extend readily to other applications.

The IETF (Internet Engineering Task Force) later came up with a scheme called IPsec, which stands for IP Security Protocol. Relatively unknown until recently, IPsec presents a menu of available security schemes once users are connected and authorized, then chooses the most secure method of data transmission. It will be built into the upcoming version 6 of the Internet Protocol and widely used for transferring data on the Internet, Langer said.

PGP, or Pretty Good Privacy, is a freely downloadable hybrid encryption method that uses a public key system for authentication in the header of a document, and a private key (symmetric) method to encrypt the remaining data. Since private-key methods require less computing time, or overhead, the hybrid approach is faster and more efficient than pure public-key schemes, albeit slightly less secure, Langer said.
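
A rough sketch of that hybrid pattern in Python, using the "cryptography" package (this illustrates the idea rather than PGP's actual message format): the bulk data is encrypted with a cheap one-time symmetric key, and only that small key is wrapped with the recipient's public key.

    # Hybrid encryption sketch in the spirit of PGP.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Sender: encrypt the document with a fresh symmetric session key (fast)...
    session_key = Fernet.generate_key()
    encrypted_document = Fernet(session_key).encrypt(b"full imaging study ...")

    # ...and wrap only the small session key with the recipient's public key.
    wrapped_key = recipient_public.encrypt(session_key, oaep)

    # Recipient: unwrap the session key, then decrypt the bulk data.
    recovered_key = recipient_private.decrypt(wrapped_key, oaep)
    assert Fernet(recovered_key).decrypt(encrypted_document) == b"full imaging study ..."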

These security schemes have resulted in a wide range of applications, including encrypted file systems like TCFS (Transparent Cryptographic File System). This approach leaves the data encrypted on the storage media. Once the user logs on with proper authorization, the data is decrypted in RAM and can be viewed as long as the user remains logged on. However, the system requires a high level of trust between servers.
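
The idea can be sketched in a few lines of Python (illustrative only -- this is not the TCFS implementation): the file rests on disk in encrypted form and is decrypted in memory only for an authenticated user.

    # Encrypted file store sketch: data is encrypted at rest, decrypted in RAM
    # only after the user has logged on. Names and key handling are illustrative.
    from pathlib import Path
    from cryptography.fernet import Fernet

    storage_key = Fernet.generate_key()      # in practice, derived at login

    # At rest: the report exists on disk only in encrypted form.
    Path("report.enc").write_bytes(Fernet(storage_key).encrypt(b"CT findings ..."))

    def read_report(user_is_authenticated: bool) -> bytes:
        """Decrypt the stored report in memory, but only for a logged-in user."""
        if not user_is_authenticated:
            raise PermissionError("log in first")
        return Fernet(storage_key).decrypt(Path("report.enc").read_bytes())

    print(read_report(user_is_authenticated=True))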

Security products such as Symantec's For Your Eyes Only are available for Windows-based systems. For Unix servers, SSH (Secure Shell) creates an encrypted tunnel between systems at login, so that otherwise unencrypted traffic such as Telnet data is encrypted before it starts flowing, Langer said.

"The cool thing about Secure Shell is that you can direct other things to go through it. You can take a service, say your Web server that normally runs on port 80 ... and run it through Secure Shell on whatever port it happens to be, and then that enjoys an encrypted tunnel."

Hardware alternatives to software encryption schemes can eliminate the extra computing time needed to encrypt and decrypt information.

"Maybe you're in teleradiology, you've got a slow pipe, and you're already waiting 30-40 minutes for that MR piece to come across. Do you want to incur another 20%-30% slowdown due to software encryption?

If you don't, hardware products such as Intraport 2 can be attached to one end of the system, creating a type of firewall. The system requires anyone who reads the data to use DES software, but computer overhead is reduced to just one side of the communication network, Langer said.

"We've done some performance measurements with the system we use. Usually the overhead for transmitting a single MR case with single (56-bit) DES is something on the order of 20%-30%; it can be as bad as 50%-50% in triple (128-bit) DES," he said. "That's probably too much to ask people to put up with, and really it probably isn't necessary for casual teleradiology."

Other products combine firewalls with encryption, such as the WatchGuard SOHO and products from Intrusion.com, Langer said.

"These are the ones I tend to think people will find most useful," he said.

Integrity measures such as digital signatures are designed to detect the alteration of data in transmission. For example, Person A computes a message digest (MD) of the document, encrypts it with a private key, and sends the MD value along with the document.

If Person B intercepts the document and edits it, the MD changes. However, Person B can't read or change the privately encrypted MD value. So when the intended recipient, Person C, gets the message, decrypts the MD value, and recalculates the MD for the current document, he can tell that it has been altered, Langer said.
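
A minimal Python sketch of that digest-and-sign scheme, again with the "cryptography" package (the document text is made up): A signs a hash of the document, and any later edit makes Person C's verification fail.

    # Digital signature sketch: sign a digest of the document with A's private
    # key; tampering changes the digest, so verification with A's public key fails.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.exceptions import InvalidSignature

    a_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    a_public = a_private.public_key()
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    document = b"Discharge summary: no abnormal findings."
    signature = a_private.sign(document, pss, hashes.SHA256())

    # If Person B alters the document in transit, Person C's check fails.
    tampered = b"Discharge summary: abnormal findings."
    try:
        a_public.verify(signature, tampered, pss, hashes.SHA256())
    except InvalidSignature:
        print("Document was altered after signing")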

The "once-used secret" method authenticates users to a device. Both communicating machines may have synchronized random number generators, that change the authenticating number every minute. Once a user authenticates, the number is never used again.

An alternative to single-use authentication is the smart card, which works in secure or non-secure networks. It requires the user to authenticate to the card itself, using biometrics or a traditional method such as a PIN code. The user's authentication code is never transmitted across the network; the code merely activates the card, which then transmits its own private-key token to the server. The server then sends the token along with the public key to the receiving machine for decryption.

Kerberos, developed at MIT in the late 1980s, can securely connect a user or software agent to multiple hosts with a single authentication, saving a lot of time and effort. It works by authenticating the agent to a network, creating an encrypted tunnel, then issuing "tickets" to other multiple hosts based on access control lists.

The Electronic Frontier Foundation and CERT offer continually updated information on data security issues.

By Eric Barnes
AuntMinnie.com staff writer
September 19, 2000

Copyright © 2000 AuntMinnie.com
