This book helps practitioners gain a deeper, applied understanding of the issues involved in improving data quality through editing, imputation, and record linkage. The first part of the book deals with methods and models, focusing on the Fellegi-Holt edit-imputation model, the Little-Rubin multiple-imputation scheme, and the Fellegi-Sunter record linkage model. Brief examples are included to show how these techniques work.
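To give a flavor of how the simplest of these ideas plays out in practice, here is a brief illustrative sketch (not taken from the book) of the scoring step at the heart of the Fellegi-Sunter record linkage model: each comparison field contributes an agreement weight of log2(m/u) or a disagreement weight of log2((1-m)/(1-u)), the field weights are summed, and the composite weight is compared against two thresholds. The field names, m- and u-probabilities, and thresholds below are hypothetical values chosen purely for demonstration; in practice they are estimated from the data (for example, via the EM algorithm).

```python
import math

# Hypothetical m- and u-probabilities for three comparison fields.
# m = P(field agrees | records are a true match)
# u = P(field agrees | records are not a match)
FIELDS = {
    "surname":    {"m": 0.95, "u": 0.01},
    "birth_year": {"m": 0.90, "u": 0.05},
    "zip_code":   {"m": 0.85, "u": 0.10},
}

def match_weight(agreements):
    """Sum the log-likelihood-ratio weights over all comparison fields.

    `agreements` maps each field name to True (values agree) or False.
    Agreement contributes log2(m/u); disagreement contributes
    log2((1-m)/(1-u)), which is negative for informative fields.
    """
    total = 0.0
    for field, p in FIELDS.items():
        if agreements[field]:
            total += math.log2(p["m"] / p["u"])
        else:
            total += math.log2((1 - p["m"]) / (1 - p["u"]))
    return total

def classify(weight, upper=6.0, lower=-3.0):
    """Map the composite weight to the three Fellegi-Sunter decisions
    using two thresholds (hypothetical values here)."""
    if weight >= upper:
        return "link"
    if weight <= lower:
        return "non-link"
    return "possible link (clerical review)"

# Example: a record pair agreeing on surname and birth year but not ZIP.
w = match_weight({"surname": True, "birth_year": True, "zip_code": False})
print(f"weight = {w:.2f} -> {classify(w)}")
```

With these illustrative numbers, the pair's composite weight (about 8.2) exceeds the upper threshold, so it is classified as a link despite the ZIP code disagreement.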
In the second part of the book, the authors present real-world case studies in which one or more of these techniques are used. The case studies cover a wide variety of application areas, including mortgage guarantee insurance, medical, biomedical, highway safety, and social insurance, as well as the construction of list frames and administrative lists.
Readers will find this book a mixture of practical advice, mathematical rigor, management insight, and philosophy. The long list of references at the end of the book enables readers to delve more deeply into the subjects discussed here. The authors also discuss the software that has been developed to apply the techniques described in the text.
Thomas N. Herzog, Ph.D., ASA, is the Chief Actuary at the U.S. Department of Housing and Urban Development. He holds a Ph.D. in mathematics from the University of Maryland and is an Associate of the Society of Actuaries. He is the author or co-author of books on Credibility Theory, Monte Carlo Methods, and Models for Quantifying Risk.
Fritz J. Scheuren, Ph.D., is a Vice President for Statistics with the National Opinion Research Center at the University of Chicago. He holds a Ph.D. in statistics from the George Washington University. He is widely published, with over 300 papers and monographs to his credit. He served as the 100th President of the American Statistical Association and is a Fellow of both the American Statistical Association and the American Association for the Advancement of Science.
William E. Winkler, Ph.D., is Principal Researcher at the U.S. Census Bureau. He holds a Ph.D. in probability theory from Ohio State University and is a Fellow of the American Statistical Association. He has published more than 130 papers in areas such as automated record linkage and data quality. He is the author or co-author of eight generalized software systems, some of which are used in production in the largest survey and administrative-list settings.
From the reviews:
"Data Quality and Record Linkage Techniques is a landmark publication that will facilitate the work of actuaries and other statistical professionals." Douglas C. Borton for The Actuarial Digest
"This book is intended as a primer on editing, imputation and record linkage for analysts who are responsible for the quality of large databases. ? The book provides an extended bibliography with references ? . The examples given in the book can be valuable for organizations responsible for the quality of databases, in particular when these databases are constructed by linking several different data sources." (T. de Waal, Kwantitatieve Methoden, October, 2007)
"Tom Herzog has a history of writing books...that most mathematically literate people believe they already understand pretty well--until they read the book....This book...[is] interesting and informative. Anyone who works with large databases should read it." (Bruce D. Schoebel, Contingencies, Jan/Feb 2008)
"Who should read this book? The short answer is everyone who is concerned about data quality and what can be done to improve it. Buy a copy for yourself; buy another copy for your IT support." (Kevin Pledge, CompAct, October 2007)
"Data Quality and Record Linkage Techniques is one of the few books on data quality and record linkage that try to cover and discuss the possible errors in different types of data in practical situations. ? The intended audience consists of actuaries, economists, statisticians and computer scientists. ? This is a good short book for an overview of data quality problems and record linkage techniques. ? Statisticians, data analysts and indeed anyone who is going to collect data should first read this book ? ." (Waqas Ahmed Malik and Antony Unwin, Psychometrika, Vol. 73 (1), 2008)
"This book covers two related and important topics: data quality and recordlinkage. ? case studies are the book's major strength; they contain a treasure trove of useful guidelines and tips. For that reason, the book is an excellent purchase for practitioners in business, government, and research settings who plan to undertake major data collection or record linkage efforts. ? serves as a stand-alone resource on record linkage techniques. ? The book is aimed squarely at practitioners." (Jerome Reiter, Journal of the American Statistical Association, Vol. 103 (482), 2008)
"The book provides a good, sound, verbal introduction and summary, and a useful point of departure into the more technical side of database quality and record linkage problems. In summary, it should be a core sourcebook for non-mathematical statisticians in official statistics agencies, and database designers and managers in government and commerce. It also provides a useful introduction to this important topic, and a comprehensive reference list for further study, for professional statisticians and academics." (Stephan Haslett, International Statistical Reviews, Vol. 76 (2), 2008)