Health is becoming personal, predictive, and preventive through advanced technologies: wearable devices, embedded sensors, artificially intelligent robots, and virtual reality headsets. The deluge of data and feedback generated by these technologies nudges consumers toward healthier activities, or is aggregated and analyzed for insights about diverse populations across geographies. Major technology companies are investing in solutions powered by “big data” that promise to improve the health of populations worldwide. The opportunities appear boundless.

Despite this promise, ethical, legal, and social concerns associated with these technologies have emerged and could well hinder their benefits to health. The US federal government has targeted several health technology companies that cannot support their scientific claims with compelling evidence, and studies show that weak privacy and security features underlying such technologies can harm users. If these challenges are not proactively mitigated, the potential improvements to health may not be realized at scale.

Overcoming these issues requires the collective views of disparate stakeholders and cross-sector collaboration: one voice is not as powerful as many in unison. As a start, colleagues from Vitality, Microsoft, and the Qualcomm Institute at the University of California, San Diego published an open-access, peer-reviewed commentary calling for a public consultation to identify best practices for eliminating the ethical, legal, and social barriers to health technologies. For 90 days in 2015, a wide range of stakeholders offered input on a draft set of guidelines for the responsible innovation of health technology and the appropriate stewardship of the data these devices generate. Feedback came from organizations such as the EU Commission, the US Food and Drug Administration, the National Academy of Medicine, and the American Heart Association.

In March 2016, Vitality released the finalized guidelines for personalized health technology. They included five recommendations:

  • Build health technologies informed by science: Integrate scientific and behavioral evidence into the design of health technologies to better understand health risks and outcomes.
  • Scale affordable health technologies: Develop cost-effective health technologies that are accessible to all populations to minimize health inequalities.
  • Guide interpretation of health data: Facilitate interpretation of health data through software design to support better health literacy.
  • Protect and secure health data: Embed privacy and security design features into health technology to ensure end-to-end protection.
  • Govern the responsible use of health technology and data: Disclose practices associated with the governance of health technologies and data to create shared values for all stakeholders.

The guidelines provide the foundation for a working group to pilot their implementation. These pilots will be assessed independently using tangible metrics, and the results will be shared. By collaborating across sectors, the guidelines seek to shift the dialogue around health technologies toward one that promotes shared values for all stakeholders. They are an attempt to convene leading industry players to bring greater transparency and accountability to health technology and data, and to avert the sorts of disputes that recently emerged between the US Federal Bureau of Investigation and Apple. The guidelines are not an attempt to preempt government regulation; rather, they aim to fill gaps in existing regulatory frameworks where needed.

Can we learn from the past to know whether we are on track? The Human Genome Project (HGP) is one example where proactive consideration of ethical, legal, and social concerns led to broader individual and societal benefits. Twenty-five years ago, the HGP was founded as an international research collaboration to sequence the human genome. Its leaders set aside a portion of the budget to foster basic and applied research on these concerns and established the Ethical, Legal, and Social Implications Research Program. Today, the National Human Genome Research Institute (NHGRI) at the US National Institutes of Health has a legislative mandate to allocate no less than 5% of its budget to these issues. As a consequence, established and accepted protocols facilitate the routine sharing of genetic data for research. The vision for our guidelines is informed by this precedent of proactively investigating the concerns raised by the possession of genetic information.

Technologies are created by people, for people. Technologies that improve the public’s health should be informed by science, affordable, and safe, and should protect users’ health data. We can collaborate to shape the future of this new frontier in health data, or we can wait in anticipation and uncertainty only to discover its unintended consequences.

Gillian Christie is a Health Innovation Analyst at The Vitality Group in New York City.

Kevin Patrick is a Professor of Family Medicine and Public Health, and a researcher at the Qualcomm Institute at the University of California, San Diego in La Jolla, California.

Chris Calitz is Director of the Center for Workplace Health Research and Evaluation at the American Heart Association in Dallas, Texas.