Nov 30, 2016
Big data initiatives hold great promise for healthcare. The ability to collect, analyze and share clinical data between providers can help achieve a multitude of reform goals, from improving patient experiences and outcomes to enabling individuals and families to make better treatment decisions. Healthcare providers also need smart analytics to operate more efficiently, reduce waste and improve margins as value-based care initiatives proliferate.
Before any of this can happen, however, providers must tackle a data quality problem that has been brewing for years. It stems from the many disparate systems and processes in use within facilities and across the larger ecosystem of providers. These systems are crucial for managing patient care in a geographic area, yet the lack of integration between them means that patient data is often incomplete, inaccurate or misinterpreted. The result is medical errors and gaps in information, both harmful to a patient's well-being and recovery.
For years, healthcare providers have attempted to create and maintain rich, centralized repositories for clinical and patient data, with the electronic health record (EHR) system as the linchpin. For those that have successfully implemented an EHR, the challenges grow over time. There are simply too many systems containing data on a single individual, and developing interfaces to all of those systems is costly and technically challenging. For example, a system receiving an EHR record may not find the data it expects in a given format or field from the sending system, even though those fields are required to integrate information between records.
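To make that failure mode concrete, here is a minimal sketch of the kind of required-field check a receiving system might run before merging an incoming record. The field names and rules are hypothetical assumptions, not drawn from any specific EHR interface.

```python
# A minimal required-field check a receiving system might run before merging an
# incoming record. Field names are hypothetical, not from any real EHR spec.

REQUIRED_FIELDS = {"patient_id", "date_of_birth", "encounter_date", "diagnosis_code"}

def find_missing_fields(incoming_record: dict) -> set:
    """Return the required fields the sending system left absent or empty."""
    return {field for field in REQUIRED_FIELDS
            if incoming_record.get(field) in (None, "")}

record = {"patient_id": "12345", "encounter_date": "2016-11-30", "diagnosis_code": ""}
missing = find_missing_fields(record)
if missing:
    # In practice the record would be queued for review rather than merged blindly.
    print(f"Cannot integrate record; missing fields: {sorted(missing)}")
```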
EHRs also suffer from data quality issues related to human error and process inconsistencies, which include:
- Inaccurate patient identifiers, such as a missing Social Security number or a misspelled name, leading to duplicate patient records or the blending of multiple individuals into one record (see the matching sketch after this list).
- Metrics that don't end up in the appropriate structured fields.
- Incorrect entry of diagnosis codes.
- Missing reports, such as lab and radiology results.
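As a rough illustration of the duplicate-record problem, the sketch below groups records by a normalized identity key so that trivial entry variations (inconsistent casing, stray whitespace) surface as likely duplicates. The field names and matching rule are illustrative assumptions; production enterprise master patient index tools use probabilistic matching to also catch genuine misspellings.

```python
# Simplistic duplicate-detection sketch: group records by a normalized identity
# key so trivial entry variations (casing, stray whitespace) can be flagged for
# review. Field names are illustrative; real EMPIs use probabilistic matching.
from collections import defaultdict

def identity_key(rec: dict) -> tuple:
    """Normalize the fields most often mistyped during registration."""
    name = rec.get("last_name", "").strip().lower()
    dob = rec.get("date_of_birth", "")
    ssn_last4 = (rec.get("ssn") or "")[-4:]  # tolerate a missing SSN
    return (name, dob, ssn_last4)

def likely_duplicates(records: list) -> list:
    groups = defaultdict(list)
    for rec in records:
        groups[identity_key(rec)].append(rec)
    return [group for group in groups.values() if len(group) > 1]

patients = [
    {"mrn": "A1", "last_name": "Smith ", "date_of_birth": "1980-02-01", "ssn": "123456789"},
    {"mrn": "B7", "last_name": "smith", "date_of_birth": "1980-02-01", "ssn": "123456789"},
]
for group in likely_duplicates(patients):
    print("Possible duplicate records:", [rec["mrn"] for rec in group])
```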
A majority of hospitals have an EHR system that has been federally tested and certified for the government's incentive program, according to the Office of the National Coordinator for Health Information Technology (ONC), but there is still much work to be done. Only 9% of providers were fully compliant with ONC's 2015 EHR certification, which calls for products that enable open APIs, according to an eHealth Initiative survey.
Steps to Improving Data Quality
Just like treating an aggressive cancer, tackling the data quality problem requires multiple strategies. Efforts currently underway among organizations include:
- Adopting and improving a set of national interoperability standards so that information systems can exchange data consistently.
- Developing an enterprise master patient index (EMPI) at the institutional level.
- Participating in public and private healthcare information exchanges, which studies show can speed service to patients and reduce costs by avoiding redundant preliminary steps such as repeat tests.
- Building data validation into EHR systems to verify the accuracy, meaningfulness and security of data (a minimal validation sketch follows this list).
- Developing easy-to-use information portals where consumers can access their health records and provide feedback on missing or inaccurate data.
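As noted in the list above, here is a minimal sketch of the field-level validation an EHR interface might apply before accepting data. The field names and patterns are illustrative assumptions; a real system would validate codes against the full ICD-10 catalog rather than relying on a format check alone.

```python
# Minimal sketch of field-level validation an EHR interface might apply before
# accepting data. Field names and patterns are illustrative assumptions only.
import re

VALIDATORS = {
    # Loose pattern for ICD-10-style codes: a letter, a digit, an alphanumeric,
    # then an optional dot and up to four more characters.
    "diagnosis_code": re.compile(r"[A-Z]\d[0-9A-Z](\.[0-9A-Z]{1,4})?"),
    "date_of_birth": re.compile(r"\d{4}-\d{2}-\d{2}"),  # ISO 8601 date
}

def validate(record: dict) -> list:
    """Return a human-readable list of format failures for a record."""
    errors = []
    for field, pattern in VALIDATORS.items():
        value = record.get(field) or ""
        if not pattern.fullmatch(value):
            errors.append(f"{field}: {value!r} fails its format check")
    return errors

print(validate({"diagnosis_code": "E11.9", "date_of_birth": "1980-02-01"}))  # []
print(validate({"diagnosis_code": "9999", "date_of_birth": "Feb 1 1980"}))   # two errors
```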
Explore how to overcome the challenges of patient data interoperability in our white paper.