Computing Reviews
A methodology to evaluate important dimensions of information quality in systems
Todoran I., Lecornu L., Khenchaf A., Le Caillec J. Journal of Data and Information Quality 6(2): 1-23, 2015. Type: Article
Date Reviewed: Aug 5 2015

As the world begins to surf down the “trough of disillusionment” with big data [1], on the journey to sorting out what is real and what has value, we rediscover the discipline that data and information scientists have been nurturing for quite some time: assessing the quality of what has been created and displayed. This domain of interest is front and center the minute you successfully integrate a variety of data sources, enriching and producing a novel set of analytics that was never before possible. After stepping back and marveling at this creation, someone rightfully asks, “How do we measure the quality of the information we are working with?” This question is at the heart of the paper.

The challenge with data quality frameworks is that they tend not to be practical. Todoran et al. highlight this, and in practice such frameworks are often unnecessary, given the simplicity of what many systems aspire to do. However, enter the world of high velocity, high volume, and high variety, and you actually need to understand entropy and the resulting quality; this is where Todoran et al. are brilliant. Even if the specific examples are not relevant to you, they offer a three-step framework that is portable to your specific situation. They include tables of quality criteria and their measures for both data and information, all of which provoke critical thinking about how to assess quality as it flows through a system. One could even argue that their formalism, well documented and exemplified, is not required in order to structure an information quality measurement strategy.

If you are involved in a new or existing data mashup, where it is not enough to have just any answer but an answer that comes with statistical transparency, the authors’ methodology will prove useful at a variety of levels.

Reviewer: Brian D. Goodman. Review #: CR143670 (1510-0889)
1) Buytendijk, F. Hype cycle for big data, 2014. Gartner. https://www.gartner.com/doc/2814517/hype-cycle-big-data- (accessed July 26, 2015).
Process Metrics (D.2.8 ... )
 
 
Performance Evaluation (Efficiency And Effectiveness) (H.3.4 ... )
 
Other reviews under "Process Metrics": Date
A Vector-Based Approach to Software Size Measurement and Effort Estimation
Hastings T., Sajeev A. IEEE Transactions on Software Engineering 27(4): 337-350, 2001. Type: Article
Feb 1 2002
Assessing uncertainty of software development effort estimates: the learning from outcome feedback
Gruschke T., Jorgensen M. Software metrics (Proceedings of the 11th IEEE International Software Metrics Symposium (METRICS’05), Sep 19-22, 2005), 2005. Type: Proceedings
Jan 4 2006
A study of the influence of coverage on the relationship between static and dynamic coupling metrics
Mitchell Á., Power J. Science of Computer Programming 59(1-2): 4-25, 2006. Type: Article
Oct 13 2006

Reproduction in whole or in part without permission is prohibited.   Copyright 1999-2024 ThinkLoud®