CDAR2_QRDA_R1D1_2008SEPT

Implementation Guide For CDA Release 2
Levels 1, 2 and 3
Quality Reporting Document Architecture (QRDA)

Based on HL7 CDA Release 2.0

(US Realm)

Draft Standard for Trial Use

First Ballot

September 2008

© 2008 Health Level Seven, Inc.
Ann Arbor, MI
All rights reserved.

Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Chair/Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Co-Editor: / [Name]
[Organization or Company]
[E-mail]
Current Working Group also includes: / [Name, Name, Name, Name, Name]


Acknowledgments

This Guide was produced and developed through the efforts of the Quality Reporting Document Architecture (QRDA) Project, supported by the Child Health Corporation of America (CHCA) to develop and support a standard for quality reporting. Through this project, CHCA supports the HL7 Pediatric Data Standards Special Interest Group and others in developing a draft standard for reporting quality measure data.

The QRDA committee, made up of representatives from AHIC, AHIMA, IHE, CHCA, the Collaborative, MedAllies, and NHINs, was instrumental in guiding the project so that alignment occurred among the organizations interested in and working on various aspects of quality reporting.

The co-editors would also like to express their appreciation for the support and sponsorship of the Structured Documents Working Group and the Pediatric Data Standards Special Interest Group.

Finally, we would like to acknowledge the foundational work on Health Level Seven (HL7) Version 3 and the Reference Information Model (RIM), the HL7 domain committees, especially Patient Care, and the work done on CDA itself.

We would also like to acknowledge the collaborative effort of ASTM and HL7, which produced the Continuity of Care Document (CCD). All these efforts were critical ingredients to the development of this DSTU and the degree to which it reflects these efforts will foster interoperability across the spectrum of health care.

Revision History (to be removed prior to putting up for ballot)

Rev / Date / By Whom / Changes / Note
New / May 21, 2008 / Gay Giannone


Table of Contents

1 Introduction 8

1.1 Purpose 8

1.2 Scope 8

1.2.1 Background 8

1.2.2 Current Project 9

1.3 Audience 9

1.4 Process of Formalizing a Measure 9

1.4.1 Role of Professional Societies 10

1.4.2 Role of Technical Groups 11

1.4.3 EHRs and Quality Reporting 11

1.5 Definition of a Quality Measure 13

1.5.1 Types of Quality Measure Reports 13

1.5.2 Measure Set 14

1.6 Approach 14

1.6.1 Organization of this Guide 14

1.6.2 Use of Templates 15

1.7 Conventions Used in This Guide 16

1.7.1 Explanatory Statements 16

1.7.2 Conformance Requirements 16

1.7.3 Vocabulary Conformance 17

1.7.4 XPath Notation 17

1.7.5 Keywords 17

1.7.6 XML Samples 18

1.7.7 Contents of the Ballot Package 18

2 Category One QRDA CDA – For Ballot 19

2.1 Header Constraints 19

2.1.1 Header attributes 19

2.1.2 Participants (Do we want a participant scenario chart here?) 19

2.2 Category One Body Constraints 22

2.3 QRDA Category One Section Constraints 24

2.3.1 Measure Set Section Conformance 24

2.3.2 Measure Section Conformance 25

2.3.3 Reporting Parameters Section Conformance 26

2.3.4 Patient Data Section Conformance 27

3 Category Two QRDA CDA – For Comment 30

3.1 Header Constraints 30

3.1.1 Header attributes 30

3.1.2 Participants 30

3.1.3 Header relationships 33

3.2 Category Two Body Constraints 33

3.3 QRDA Category Two Section Constraints 34

3.3.1 Required Section 34

3.3.2 Entry Patterns 34

4 Category Three QRDA CDA – For Comment 35

4.1 Header Constraints 35

4.1.1 Header attributes 35

4.1.2 Participants 35

4.1.3 Header relationships 37

4.2 Category Three Body Constraints 38

4.3 Section Constraints 38

4.3.1 Required Section 38

4.3.2 Entry Patterns 39

5 Neonatal Admission Temperature QRDA Implementation Guide – For Ballot 40

5.1 Introduction and Purpose 40

5.2 Rationale for Choosing Neonatal Admission Temp Measure 40

5.3 Measure Information 41

5.4 NEONATAL ADMISSION TEMPERATURE QRDA – Category One Header Additional Constraints 42

5.4.1 Header Attributes 42

5.5 NEONATAL ADMISSION TEMPERATURE QRDA – Category One Additional Body Constraints 43

5.6 NEONATAL ADMISSION TEMPERATURE QRDA – Category One Additional Section Constraints 43

5.6.1 Measure Section Conformance 43

5.6.2 Reporting Parameters Section 43

5.6.3 Patient Data Section 43

6 Pediatric Body Mass Index QRDA Implementation Guide – For Ballot 46

6.1 Introduction and Purpose 46

6.2 Rationale for Choosing Body Mass Index (BMI) Measure 46

6.3 Measure Information 46

6.4 BMI QRDA – Category One Header Constraints 48

6.4.1 Header Attributes 48

6.4.2 Participants 48

6.5 BMI QRDA – Category One Body Constraints 48

6.6 BMI QRDA – Category One Section Constraints 48

6.6.1 Measure Section Conformance 48

6.6.2 Reporting Parameters 48

6.6.3 Patient Data Section 48

7 References 49

Appendix A — Template IDs defined in this Guide 50

8 Open issues 51

Table of Figures

Figure 1 Fundamental Steps in Quality Measure Development and Reporting Definition 10

Figure 2: clinicalDocument example 18

Figure 3: realmCode Category One example 19

Figure 4: clinicalDocument/templateId Category One example 19

Figure 5: recordTarget Category One Example 20

Figure 6: assignedAuthor Category One Example 20

Figure 7: Informant - Category One Example 21

Figure 8: Custodian Category One Example 21

Figure 9: legalAuthenticator Category One Example 22

Figure 10 Example of rendered QRDA Category One Report 23

Figure 11: Measure Set Section Example 24

Figure 12 MeasureAct example 26

Figure 13 Reporting Parameters TimeElement example 27

Figure 14: realmCode Category Two example 30

Figure 15: Null flavor recordTarget example 31

Figure 16: AssignedAuthor as a processing entity example 31

Figure 17 Informant Category Two example 32

Figure 18: Custodian Category Two Example 32

Figure 19 legalAuthenticator Category Two example 33

Figure 20: documentationOf Category Two Example 33

Figure 21: realmCode Category Three example 35

Figure 22: Null flavor recordTarget example 36

Figure 23: AssignedAuthor as a processing entity example 36

Figure 24 Informant Category Three example 36

Figure 25: Custodian Category Three Example 37

Figure 26: legalAuthenticator Category Three example 37

Figure 27 documentationOf Category Three Example 38

Table of Tables

Table 1: Contents of the Ballot Package 18

Table 2: Template IDs Defined in this Guide 50

1  Introduction

1.1  Purpose

The IOM definition of quality is, “The degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge.” [1] In order for knowledge about care quality to be evaluated, it must be gathered and communicated to the appropriate organizations.

The purpose of this document is to describe constraints on CDA Header and Body elements for quality reporting documents. The Quality Reporting Document Architecture (QRDA) is a document format that provides a standard structure with which to report quality measures to organizations that will analyze and interpret the data they receive. Measuring quality in health care is complex. Accurate, interpretable data, efficiently gathered and communicated, are key to correctly assessing the quality of care delivered.
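In CDA terms, a QRDA report is an XML instance of the ClinicalDocument class with QRDA-specific constraints applied in the Header and Body. The following skeleton is for illustration only; the templateId root and document code shown are placeholders, not identifiers assigned by this Guide:

<!-- Illustrative QRDA skeleton; templateId and code values are placeholders -->
<ClinicalDocument xmlns="urn:hl7-org:v3">
  <realmCode code="US"/>
  <typeId root="2.16.840.1.113883.1.3" extension="POCD_HD000040"/>
  <templateId root="x.x.x.x"/>  <!-- placeholder for a QRDA template OID -->
  <code code="x-x" codeSystem="2.16.840.1.113883.6.1"
        displayName="Quality Reporting Document"/>  <!-- placeholder document-type code -->
  <title>QRDA Category One Report</title>
  <effectiveTime value="20080901"/>
  <!-- recordTarget, author, custodian, and the structured body follow,
       constrained as described in the sections of this Guide -->
</ClinicalDocument>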

1.2  Scope

1.2.1  Background

HL7 Quality Reporting Document Architecture (QRDA) Project[2]

The HL7 QRDA Project aims to develop standard specifications for communicating relevant information that will be used for improving the quality of healthcare. Healthcare institutions routinely collect and report performance measure data to improve the quality of care provided to patients. Current data collection and reporting activities rely on a variety of mechanisms that range from structured paper to electronic data entry formats – usually derived from claims-based data sets or manual data abstraction. The HL7 Pediatric Data Standards Special Interest Group (PeDSSIG) pioneered the QRDA initiative with funding for Phase I from the Alliance for Pediatric Quality.[3] The initiative is aimed at developing an EHR-compatible standard for distributing data related to patient-level quality measures across disparate healthcare IT systems. Participating organizations are dedicated to the belief that such a standard will make it easier to support the analysis and tracking of healthcare quality, decrease the reporting burden for providers and improve the quality of data used for measurement.

In the first phase of the QRDA initiative, participating organizations confirmed the feasibility of using the HL7 Clinical Document Architecture (CDA) as the foundation for the QRDA specification. It was concluded that CDA, a document markup standard that defines the structure and semantics of clinically-relevant documents for healthcare information exchange across EMRs, can provide the technical underpinnings for communicating pediatric and adult quality measures for both inpatient and ambulatory care settings. The project team developed sample QRDA instances from an adult use case developed for the CMS Doctor Office Quality–Information Technology (DOQ-IT) initiative (defined as an HL7 Version 2.4 messaging specification), and a sample pediatric quality measure from the Joint Commission Pediatric Asthma Measures.

1.2.2  Current Project

The current Phase II project was to develop a QRDA Implementation Guide and other materials needed for the September 2008 HL7 ballot that could make QRDA a Draft Standard for Trial Use (DSTU). This effort is supported by the Child Health Corporation of America (CHCA) and MedAllies. As its initial output, the QRDA DSTU defines three (3) levels or categories of quality reporting (see 1.5.1 Types of Quality Measure Reports). The section of the DSTU that defines the Category One Report is ballotable, while the sections that define Category Two and Category Three are for comment only.

The QRDA initiative is compatible with parallel industry efforts and organizations that are addressing the quality landscape, including the American Health Information Community (AHIC), Healthcare Information Technology Standards Panel (HITSP) and Integrating the Healthcare Enterprise (IHE). The intent of QRDA is not to define the logic of the measure as applied within an EHR but to model the measure in CDA format.

The goal of QRDA is to standardize the framework of quality reports nationally and to define the way the data about measures are structured, creating interoperability between reporting and receiving institutions.

1.3  Audience

The audience for this document includes software developers and consultants responsible for implementing reporting capabilities within their Electronic Health Record (EHR) systems, as well as developers and analysts in receiving institutions and in local, regional, and national health information exchange networks who wish to create and/or process CDA reporting documents created according to this specification.

1.4  Process of Formalizing a Measure

Ideally, a process is in place whereby the measure is formally specified, through a process that involves domain experts and a computable representation. From there, one could theoretically auto-generate the QRDA Category I, II, and III specifications. However, in some cases the development of a QRDA specification will come before there is an agreed-upon formal representation of a particular measure. In these cases, care must be taken to establish a planned collaboration process between the domain experts and the measure representation designers to reach consensus that the intent is captured and the output is useful.

Measuring clinical performance is recognized as an important tool for improving the quality of patient care. Health care institutions routinely collect and report performance measure data in an effort to monitor and assess the quality of care provided to their patients. Performance measures are developed, promoted, and maintained by institutions concerned about health care quality: they are often developed and promoted by governmental, public, and private organizations and by medical specialties, and are frequently backed by academia. Currently, data used for determining performance are often derived from manual paper chart abstraction and administrative datasets, and have well-known limitations in accuracy, timeliness, and use of resources.

A number of the recently established population-based quality measures were developed from evidence-based clinical guidelines. These quality measures have gone through consensus among domain experts and are often considered essential and pertinent to the health problem at hand. Established measures are not only supported by the rigor of scientific evidence but are also deemed practical to implement. These publicly available measures typically represent common diseases that afflict the population, such as asthma, myocardial infarction, heart failure, pneumonia, diabetes, and hypertension, to name a few. These measures are used for quality improvement, research, and often for accountability purposes (e.g., pay-for-reporting, pay-for-performance).

Figure 1 Fundamental Steps in Quality Measure Development and Reporting Definition

* Measure Development Organization

** Quality Improvement organization

1.4.1  Role of Professional Societies

Professional societies such as the AMA and CHCA should engage with technical implementers and/or Standards Development Organizations (SDOs) to formalize the knowledge representation of quality measures. The following bullet points highlight important aspects of the professional societies' roles.

·  Provide explicit and unambiguous measure description

·  Provide uniformity in defining and categorizing measure specifications

·  Promote standardization of naming conventions for quality measures

·  Provide formalized processes for measure development and maintenance

·  Promote the use of existing ontologies to represent data elements

·  Formalize and provide explicit representation of measure logic, algorithms, and relevant computations (e.g., age, time, exclusions, inclusions)

·  Understand the breadth of data captured by EHRs in relation to current and future quality measures; some measures may need to be revised or improved due to availability of more appropriate information

·  Follow existing standards for representing measure parameters (e.g., HITEP, HITSP)

·  Pay attention to advances in EHR capability. EHRs are a rich source of clinical data that can be utilized for quality measurement; consider using standard clinical terminologies such as SNOMED CT, in addition to or in place of administrative coding, to capture this clinical data.

1.4.2  Role of Technical Groups

Technical groups must engage with professional society measure developers to properly represent measure descriptions in a computable format. The following bullet points highlight important aspects of the technical groups' roles.