Data Normalization: Help for Quality Measures Reporting
By Cheryl Mason, MSHI
By now, most health information management (HIM) professionals are familiar with national quality measures and the role they play in value-based care. Accuracy in reporting is paramount as the Centers for Medicare and Medicaid Services (CMS) continues to elevate the relationship between these calculations and a healthcare organization’s bottom line and reputation.
Now in its third year, the Medicare Access and CHIP Reauthorization Act (MACRA) provides a framework for accountability and
transparency. The goal is to drive improvement in the quality of
care given to Medicare recipients, promote interoperability, drive
performance improvement initiatives, and assess the cost of care.
For Merit-Based Incentive Payment System (MIPS) participants,
the framework is designed around four categories of reporting requirements: quality (45 percent), promoting interoperability (25 percent), process improvement (15 percent), and cost containment (15 percent). These measurement activities help stakeholders quantify processes, outcomes, and patient satisfaction as the
industry strives for improved population health, better patient
experiences, and lower costs. Organizations that are opting for
alternative payment models (APMs) take on more risk and have a
slightly different framework. In either case, having accurate data
is essential to successfully reporting to CMS.
Because data for quality measures reporting is collected in
a variety of ways, such as insurance claims, electronic health
records (EHRs), and registries, healthcare organizations must
have systems in place that ensure complete and accurate aggregation of information. Yet, the reality is that many organizations
struggle with the basics of data management and find they are
running up against an unseen challenge: data silos.
Information needed for accurate quality measures reporting
often remains “locked” within EHRs and other disparate systems
due to inconsistent technology and documentation requirements.
Some of the data required for quality reporting is not codified to
any standard and is documented using local vernacular or simply
found only in unstructured text. Consequently, providers and payers often fail to accurately aggregate the data needed for a given
quality measure and risk reimbursement losses or reputational
consequences due to the appearance of lower care quality.
The solution to this conundrum is a “single source of truth” that
ensures data coming from disparate systems is normalized to an
industry standard for meaningful sharing. Despite broader industry efforts to address clean information sharing through technology, standards, and even incentives, barriers still exist. As industry
initiatives continue to prioritize the shift to patient-centered care,
it becomes more urgent that providers deploy systems that ensure
data collection and sharing is accurate, timely, and consistent.
Providers must leverage data normalization strategies that clean
and map disparate patient information to achieve this end.
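The clean-and-map step described above can be sketched in code. The following Python example is purely illustrative: the local code strings, field names, and the LOINC-style target codes are assumptions for demonstration, not a reference to any particular vendor's terminology service.

```python
# Illustrative sketch: normalizing locally coded lab data to a standard
# vocabulary so records from disparate systems can be aggregated.
# All codes and field names below are hypothetical examples.

LOCAL_TO_STANDARD = {
    # local code  -> (standard code, standard display name)
    "bnp_serum": ("30934-4", "Natriuretic peptide B [Mass/volume] in Blood"),
    "LAB_BNP":   ("30934-4", "Natriuretic peptide B [Mass/volume] in Blood"),
    "EF_ECHO":   ("10230-1", "Left ventricular Ejection fraction"),
}

def normalize_record(record):
    """Map a record's local code to a standard code; flag it if unmapped."""
    mapping = LOCAL_TO_STANDARD.get(record["local_code"])
    if mapping is None:
        # Unmapped data is flagged for human review rather than silently lost.
        return {**record, "standard_code": None, "needs_review": True}
    code, display = mapping
    return {**record, "standard_code": code, "standard_display": display,
            "needs_review": False}

records = [
    {"local_code": "bnp_serum", "value": 900, "unit": "pg/mL"},
    {"local_code": "EF_ECHO", "value": 35, "unit": "%"},
    {"local_code": "free_text_note", "value": None, "unit": None},
]
normalized = [normalize_record(r) for r in records]
```

Note that the unmapped third record is not discarded; keeping it visible for review is one way a normalization pipeline avoids the silent data loss that undermines quality reporting.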
Quality Measures Basics
Patient cohorts—groups of patients sharing specific characteristics—form the basis of quality measures. For instance, a
heart failure cohort may include such patient characteristics
as ejection fraction values, lab tests such as B-type natriuretic
peptide, or problem list entries. Healthcare organizations need
a method of codifying vast volumes of patient data to accurately identify and extract patients with these characteristics.
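Once data is normalized, identifying a cohort like the heart failure example above reduces to testing each patient against the qualifying characteristics. The following Python sketch assumes hypothetical field names and thresholds chosen for illustration only; it is not clinical guidance or an actual measure specification.

```python
# Illustrative sketch: selecting a heart-failure-style cohort from
# already-normalized patient data. Field names and thresholds are
# hypothetical assumptions for demonstration purposes.

def in_heart_failure_cohort(patient):
    """Return True if the patient matches any qualifying characteristic."""
    ef = patient.get("ejection_fraction")
    bnp = patient.get("bnp_pg_ml")
    low_ef = ef is not None and ef < 40            # reduced ejection fraction
    elevated_bnp = bnp is not None and bnp > 400   # elevated B-type natriuretic peptide
    on_problem_list = "heart failure" in patient.get("problem_list", [])
    return low_ef or elevated_bnp or on_problem_list

patients = [
    {"id": "A", "ejection_fraction": 35, "bnp_pg_ml": 250, "problem_list": []},
    {"id": "B", "ejection_fraction": 60, "bnp_pg_ml": 120, "problem_list": []},
    {"id": "C", "ejection_fraction": None, "bnp_pg_ml": None,
     "problem_list": ["heart failure"]},
]
cohort = [p["id"] for p in patients if in_heart_failure_cohort(p)]
# cohort -> ["A", "C"]
```

The sketch also shows why normalization matters first: if one system stores ejection fraction under a local label and another buries it in free text, no such rule can find those patients until the data has been mapped to a common form.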
Accurate data aggregation is no easy feat for the average
healthcare organization. Consider, for example, the complexities of identifying all patients for a single and relatively straightforward measure: MIPS measure 021, “Perioperative Care: Selection of Prophylactic Antibiotic – First OR Second-Generation
Working Smart a professional practice forum