What Developers Need to Know about HIPAA Compliance in Wearable Tech

By Morgan Brown / Published on May 14, 2014

With dozens of products already on the market and more on the way, it’s clear that wearable tech is only going to grow in popularity with consumers. From Fitbit to the Jawbone UP, Nike FuelBand and more, these devices are tracking more consumer health data than ever. While popular wearables track steps and calories today, it’s likely that they will track things like hydration, heart rate and more in the next few months—especially if rumors about Apple’s Healthbook are true.

It’s no surprise, then, that consumers and healthcare professionals alike see the potential in sharing this data with one another in order to better manage patient care. With constant collection and the ability to connect and share information with other systems via Bluetooth or the Web, wearables promise a far easier way to monitor patients than the current practice of patient journaling in activity logs.

However, there’s a big gap in the legal requirements between health data collected for a consumer’s personal use and data used as part of a relationship with a HIPAA covered entity such as a doctor. Consumer health data stored on a device for a consumer’s personal use isn’t subject to HIPAA rules, but as soon as that information becomes part of an exchange with a doctor or other healthcare provider, the data on the device and stored by your application falls under HIPAA regulations.

If you’re building software for wearable tech and plan to make sharing that data with healthcare providers possible, it’s essential that you understand HIPAA laws in order to ensure compliance before bringing that app or product to market.

What Is HIPAA?

HIPAA stands for the Health Insurance Portability and Accountability Act of 1996, a federal law. Its purpose, according to the United States Department of Health and Human Services, is to ensure confidentiality of healthcare information, to help ensure people are able to get and keep insurance, and to keep administrative spending under control.

The main thing developers need to understand is the security portion of these laws, because any information your application transmits to a covered entity, such as a doctor or insurance provider, is covered by HIPAA.

In January 2013, HIPAA was updated via the Final Omnibus Rule. Within this update, there are two things that affect developers of wearable tech and various mobile devices directly:

  • The first is that software developers who build applications that track, store and share healthcare information with covered entities are now required to be HIPAA compliant and meet the standards laid out in the HIPAA Security Rule, which includes the Administrative, Technical and Physical Safeguard requirements for health-related data.

  • The second is a change in the definition of a privacy breach. Under this update, it is up to business associates (any party that handles protected health information), such as an application developer, hosting provider, or a company like TrueVault, to determine whether or not an incident actually has to be reported as a breach. For example, if a wearable containing healthcare data were hacked and healthcare information were exposed, the breach would have to be reported. However, if a device is hacked but the information stored on it remains encrypted and unreadable, then there is no reportable breach.
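The encryption safe harbor described in the second point can be reduced to a simple decision rule. The sketch below is an illustration of that logic only, not legal advice, and the function and parameter names are assumptions for this example:

```python
# Sketch of the encryption "safe harbor" described above: exposure of
# properly encrypted data, where the keys were not also compromised, is
# generally not a reportable breach. Illustrative only; not legal advice.
def requires_breach_notification(data_exposed: bool,
                                 data_encrypted: bool,
                                 keys_compromised: bool) -> bool:
    if not data_exposed:
        # Nothing left the device or server; no breach to report.
        return False
    if data_encrypted and not keys_compromised:
        # Exposed ciphertext without the keys stays unreadable.
        return False
    return True
```

In practice this determination involves a documented risk assessment, but the rule of thumb is the same: exposed plaintext must be reported, exposed ciphertext with intact keys generally need not be.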

Given the heavy fines and other sanctions allowed under HIPAA, and the simple fact that a person’s health information should remain private and secure, understanding the requirements of the Security Rule is crucial to ensuring compliance for any wearable application you develop.

HIPAA Privacy Requirements

Before you start building, you want a good handle on what does and does not constitute HIPAA compliance in your technical and physical safeguards. For example, HIPAA compliant hosting will satisfy the physical safeguard requirements but not the technical ones. Noncompliance can carry penalties of up to $50,000 per violation, depending on the nature and extent of the data exposed.

At a minimum, developers should implement the following safeguards:

  • All data should be protected by passwords and user authentication methods.
  • Encryption must be used to protect data, both at rest and in transit.
  • There must be a way to remotely wipe or disable the data.
  • File sharing should not be included on the device.
  • Firewalls should always be put in place and enabled properly.
  • All devices should have security software that is regularly updated.

You can learn more about the specific requirements for each element in this checklist for HIPAA compliance for developers.
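The first checklist item, passwords and user authentication, can be sketched in a few lines. The snippet below is a minimal illustration using only Python’s standard library; the function names are ours, not from any particular framework, and a production system would layer this under a vetted authentication library:

```python
# Minimal sketch of the authentication checklist item: never store raw
# passwords; store a salted, slow hash and compare in constant time.
import hashlib
import hmac
import secrets

ITERATIONS = 200_000  # PBKDF2 work factor; tune for your hardware


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived key) to store alongside the user record."""
    salt = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key


def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Recompute the hash and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_key)
```

The same pattern, a random salt plus a deliberately slow key-derivation function, is what makes stolen credential databases expensive to crack, which is exactly the property the safeguard requirements are after.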

Understanding Protected Health Information

Certain health details are considered protected health information (PHI), while other data collected by wearables is not covered by HIPAA. Things like the number of heartbeats in a given time, the number of steps a person takes, or a person’s sleep history are not technically considered PHI and would not fall under HIPAA. However, and this is where it can get confusing, if that data is transferred in any way to a medical professional, including hospitals, doctors, and third-party companies, in the course of providing a healthcare service such as a diagnosis or treatment, then it is automatically covered by HIPAA, because it has become part of the patient’s health record. Any wearable tech used specifically in the medical field to monitor patients is, of course, covered by HIPAA.

Consent

Most wearable tech at this time doesn’t even acknowledge HIPAA. However, as these devices become more popular, the demand to use this information in patient health management will increase. If you’re developing for wearable technology and the data your software collects could realistically become part of a patient record, you should decide up front whether to build in a HIPAA compliant environment.

One additional thing to consider: obtaining consent of some type during the first installation or use of your app may make sense. The consent simply needs to state that the user of the wearable tech is giving full permission for the data recorded by the device to be shared with others. This way, the software developer has a record of the user opting in to sharing this data as part of using the software.
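As a sketch of what such an opt-in record might look like, an app could persist a timestamped consent entry on first run. The field names and JSON storage here are assumptions for illustration, not a format HIPAA prescribes, and the actual consent language should come from counsel:

```python
# Illustrative consent record captured on first run of an app.
# Field names and JSON storage are assumptions for this sketch, not a
# HIPAA-mandated format.
import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    user_id: str
    consent_text_version: str  # which version of the consent language was shown
    granted: bool              # True only if the user explicitly opted in
    recorded_at: str           # ISO 8601 UTC timestamp


def record_consent(user_id: str, version: str, granted: bool) -> ConsentRecord:
    """Build a consent entry; the caller persists it, e.g. in an append-only log."""
    return ConsentRecord(
        user_id=user_id,
        consent_text_version=version,
        granted=granted,
        recorded_at=datetime.now(timezone.utc).isoformat(),
    )


# Serialize for durable storage so there is an auditable opt-in trail.
entry = record_consent("user-123", "2014-05-v1", granted=True)
consent_json = json.dumps(asdict(entry))
```

Versioning the consent text matters: if the wording changes later, the record shows exactly what each user agreed to and when.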

As wearable tech becomes more popular, there is a good chance that the US Department of Health and Human Services, along with privacy regulators in Canada and the UK, will start issuing more guidance around the collection and sharing of wearable-collected data to ensure it conforms to applicable privacy and protection standards, such as those outlined in the HIPAA rules.

A HIPAA violation can come with a high price tag, so it’s critical that as a developer you work with your product team to determine whether you need to be HIPAA compliant or not, and then implement the proper administrative, technical and physical safeguards to comply with the law if you ultimately decide that you do.

Of course, if you use TrueVault as part of your build, we take care of the technical and physical safeguard requirements for you, enabling you to check those boxes and get on with the development of your software.

Additional Reading:

US Department of Health and Human Services
Why the New HIPAA Is Good for Mobile Health Developers
