An Introduction to Duplicate Detection
2 authors - Paperback
£22.99
Felix Naumann studied mathematics, economics, and computer science at the Technical University of Berlin. After receiving his diploma in 1997, he joined the graduate school "Distributed Information Systems" at Humboldt University of Berlin, where he completed his Ph.D. thesis on "Quality-driven Query Answering" in 2000. In 2001 and 2002 he worked at the IBM Almaden Research Center on topics around data integration. From 2003 to 2006 he was an assistant professor of information integration at Humboldt University of Berlin. Since 2006 he has held the chair for information systems at the Hasso Plattner Institute at the University of Potsdam in Germany. He is Editor-in-Chief of the Information Systems journal. His research interests lie in the areas of information integration, data quality, data cleansing, text extraction, and, of course, data profiling. He has given numerous invited talks and tutorials on the topic of the book.

Melanie Herschel finished her studies of information technology at the University of Cooperative Education in Stuttgart in 2003. She then joined the data integration group at Humboldt University of Berlin (2003–2006) and continued her research on data cleansing and data integration at the Hasso Plattner Institute at the University of Potsdam in Germany (2006–2008). She completed her Ph.D. thesis on "Duplicate Detection in XML Data" in 2007. In 2008 she worked at the IBM Almaden Research Center, concentrating her research on data provenance. Since 2009 she has pursued research on data provenance and query analysis in the database systems group at the University of Tübingen in Germany. Besides her publications and invited talks on duplicate detection and data cleansing, Melanie Herschel has been a member of several program committees and has chaired and organized a workshop on data quality.