Measuring Data Quality for Ongoing Improvement. The Morgan Kaufmann Series on Business Intelligence

  • ID: 2237656
  • Book
  • 376 Pages
  • Elsevier Science and Technology

The Data Quality Assessment Framework (DQAF) shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality.

This plain-language approach to measuring data can be understood by both business and IT and provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure.

Also included are common conceptual models for defining and storing data quality results for purposes of trend analysis, as well as generic business requirements for ongoing measurement and monitoring, including the calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies.

  • Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
  • Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
  • Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation (a brief illustrative sketch follows this list)
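
As a rough illustration of what one of these generic, ongoing measurement types might look like in practice, the short Python sketch below (our own example, not taken from the book) computes a completeness rate for a single field and compares it with the rates observed in prior loads to flag anomalies. The field name, record layout, and three-standard-deviation threshold are illustrative assumptions, not the book's specifications.

    from statistics import mean, pstdev

    def completeness_rate(records, field):
        """Share of records in which `field` is populated (non-null, non-empty)."""
        if not records:
            return 0.0
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        return filled / len(records)

    def is_anomalous(current, history, threshold=3.0):
        """Flag the current measurement if it deviates from the historical mean
        by more than `threshold` standard deviations (illustrative rule only)."""
        if len(history) < 2:
            return False  # not enough history to establish a baseline
        mu, sigma = mean(history), pstdev(history)
        if sigma == 0:
            return current != mu
        return abs(current - mu) / sigma > threshold

    # Hypothetical incremental load and the completeness rates of prior loads.
    todays_load = [
        {"claim_id": 1, "diagnosis_code": "E11.9"},
        {"claim_id": 2, "diagnosis_code": None},
        {"claim_id": 3, "diagnosis_code": "I10"},
    ]
    history = [0.98, 0.97, 0.99, 0.98]

    rate = completeness_rate(todays_load, "diagnosis_code")
    print(f"completeness = {rate:.2%}, anomalous = {is_anomalous(rate, history)}")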

Please Note: This is an On Demand product; delivery may take up to 11 working days after payment has been received.

Section One: Concepts and Definitions

Chapter 1: Data

Chapter 2: Data, People, and Systems

Chapter 3: Data Management, Models, and Metadata

Chapter 4: Data Quality and Measurement

Section Two: DQAF Concepts and Measurement Types

Chapter 5: DQAF Concepts

Chapter 6: DQAF Measurement Types

Section Three: Data Assessment Scenarios

Chapter 7: Initial Data Assessment

Chapter 8: Assessment in Data Quality Improvement Projects

Chapter 9: Ongoing Measurement 

Section Four: Applying the DQAF to Data Requirements

Chapter 10: Requirements, Risk, Criticality

Chapter 11: Asking Questions

Section Five: A Strategic Approach to Data Quality

Chapter 12: Data Quality Strategy

Chapter 13: Quality Improvement and Data Quality

Chapter 14: Directives for Data Quality Strategy

Section Six: The DQAF in Depth

Chapter 15: Functions of Measurement: Collection, Calculation, Comparison

Chapter 16: Features of the DQAF Measurement Logical Model

Chapter 17: Facets of the DQAF Measurement Types

Appendix A: Measuring the Value of Data

Appendix B: Data Quality Dimensions

Appendix C: Completeness, Consistency, and Integrity of the Data Model

Appendix D: Prediction, Error, and Shewhart's Lost Disciple, Kristo Ivanov

Glossary

Bibliography

Sebastian-Coleman, Laura
Laura Sebastian-Coleman, a data quality architect at Optum Insight, has worked on data quality in large health care data warehouses since 2003. Optum Insight specializes in improving the performance of the health system by providing analytics, technology, and consulting services. Laura has implemented data quality metrics and reporting, launched and facilitated Optum Insight's Data Quality Community, contributed to data consumer training programs, and led efforts to establish data standards and manage metadata. In 2009, she led a group of analysts from Optum and UnitedHealth Group in developing the original Data Quality Assessment Framework (DQAF), which is the basis for Measuring Data Quality for Ongoing Improvement.

An active professional, Laura has delivered papers at MIT's Information Quality Conferences and at conferences sponsored by the International Association for Information and Data Quality (IAIDQ) and the Data Governance Organization (DGO). From 2009 to 2010, she served as IAIDQ's Director of Member Services.

Before joining Optum Insight, she spent eight years in internal communications and information technology roles in the commercial insurance industry. She holds the IQCP (Information Quality Certified Professional) designation from IAIDQ, a Certificate in Information Quality from MIT, a B.A. in English and History from Franklin & Marshall College, and a Ph.D. in English Literature from the University of Rochester (NY).
