Data Quality
This video was recorded at the 7th International Symposium on Intelligent Data Analysis, Ljubljana 2007.

In industry, the term "quality", used in the context of quality control or assurance of products (and later services), has a history of about one hundred years. An ISO norm defines it as "suitability for use relative to a given objective of usage". Looking at products and processes, one distinguishes between "quality of design" and "quality of performance".

"Data quality" has been in use at statistical offices and supranational organizations (OECD, UN NAGroup, etc.) for about the same time. It became popular in computer science twenty years ago, when data quality problems related to data warehousing, ETL, data cleansing, data mining, and data integration were detected. Data quality is mostly defined as above, i.e. fitness for use given an objective of data processing in a specific domain. For example, the objective may be web mining, where semi-structured data is to be integrated.

Evidently, the term "data quality" has many facets. Stepwise refining the granularity from several data sources down to a single value of an attribute (variable), one can distinguish between multiple sources or databases, single databases (on the schema or data level), records, and values. For instance, on the data level, errors, outliers, null values (missing values), inconsistent (incoherent) values, or simply semantic misuse of data are of concern, while on the schema level integrity constraints may be violated. All of these factors may lead to low data quality.
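The data-level problems named above (null values, outliers, incoherent values) can be sketched as simple record checks. This is a minimal illustrative example, not taken from the talk: the field names (`age`, `birth_year`, `obs_year`), the valid age range, and the coherence rule are all assumptions chosen for the sketch.

```python
def check_quality(records):
    """Return the indices of records exhibiting data-level quality problems.

    Three of the problem classes mentioned in the abstract are checked:
    null (missing) values, outliers, and inconsistent (incoherent) values.
    Each record is assumed to be a dict with keys 'age', 'birth_year',
    and 'obs_year' (hypothetical fields for this sketch).
    """
    problems = {"nulls": [], "outliers": [], "inconsistent": []}
    for i, r in enumerate(records):
        if r["age"] is None:                      # null / missing value
            problems["nulls"].append(i)
            continue
        if not 0 <= r["age"] <= 120:              # crude range-based outlier rule
            problems["outliers"].append(i)
        if r["age"] != r["obs_year"] - r["birth_year"]:
            problems["inconsistent"].append(i)    # incoherent value combination
    return problems

records = [
    {"age": 34,   "birth_year": 1973, "obs_year": 2007},  # clean
    {"age": None, "birth_year": 1980, "obs_year": 2007},  # missing value
    {"age": 270,  "birth_year": 1990, "obs_year": 2007},  # outlier and incoherent
]
print(check_quality(records))
# → {'nulls': [1], 'outliers': [2], 'inconsistent': [2]}
```

A schema-level check (e.g. an integrity constraint such as a foreign-key reference) would instead be enforced by the database system rather than scanned per record as here.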

