Computing Reviews
Posterior expectation of regularly paved random histograms
Sainudiin R., Teng G., Harlow J., Lee D. ACM Transactions on Modeling and Computer Simulation 23(1): 1-20, 2013. Type: Article
Date Reviewed: May 3, 2013

Parametric and nonparametric supervised machine learning algorithms are useful in artificial intelligence (AI). The literature includes illustrations of decision-tree-based classifiers for hyperplane partitioning, higher-dimensional grammars, and the recognition of character and color patterns [1]. But how should we design reliable and efficient statistical data structures for nonparametric density estimation of massive, multidimensional data in complex AI applications?

Sainudiin et al. present an effective statistical binary tree algorithm that first splits boxes along their widest sides and then prunes the tree to build a stable histogram. The algorithm includes a strategy for saving computer memory when computing the maximum likelihood of computationally demanding histograms for AI applications.
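The splitting step described above can be illustrated in miniature. The sketch below is not the authors' implementation; it only shows the idea of a regularly paved histogram tree: each box is bisected at the midpoint of its widest side, sample counts are kept in the leaves, and the density estimate in a leaf is its count divided by sample size times leaf volume. The function names, stopping rule (`max_depth`, `min_count`), and data layout are all assumptions made for illustration.

```python
import numpy as np

def build_paving(points, box, max_depth=6, min_count=5):
    """Recursively bisect `box` along its widest side, keeping sample
    counts in each leaf -- an illustrative regular-paving histogram tree."""
    lo, hi = box
    node = {"box": box, "count": len(points), "children": None}
    if max_depth == 0 or len(points) <= min_count:
        return node
    # The "regular" split: midpoint of the widest coordinate of the box.
    d = int(np.argmax(hi - lo))
    mid = (lo[d] + hi[d]) / 2.0
    left = points[:, d] <= mid
    hi_l = hi.copy(); hi_l[d] = mid   # left child box: [lo, hi_l]
    lo_r = lo.copy(); lo_r[d] = mid   # right child box: [lo_r, hi]
    node["children"] = (
        build_paving(points[left], (lo, hi_l), max_depth - 1, min_count),
        build_paving(points[~left], (lo_r, hi), max_depth - 1, min_count),
    )
    return node

def density_at(node, x, n_total):
    """Histogram density estimate at x: leaf count / (n * leaf volume)."""
    while node["children"] is not None:
        l, r = node["children"]
        lo, hi = l["box"]
        node = l if np.all((x >= lo) & (x <= hi)) else r
    lo, hi = node["box"]
    return node["count"] / (n_total * float(np.prod(hi - lo)))
```

For samples drawn uniformly on the unit square, for example, the estimate returned by `density_at` should hover near 1 everywhere in the box.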

The authors perform simulation experiments to assess the accuracy of the algorithm for estimating the posterior expectation of regularly paved random histograms. The algorithm did not perfectly average the independent posterior samples of multivariate structured and unstructured distributions. It was used to estimate densities of high-dimensional data from multivariate uniform, Gaussian, and Rosenbrock densities, and its performance is remarkable. The authors creatively review the limitations of existing algorithms for estimating the densities of high-dimensional data. They cleverly introduce an integrated absolute error for gauging how well algorithms fit simulated datasets drawn from high-dimensional densities. Given that the reliability of the technique for investigating the posterior expectation of regularly paved random histograms was derived from simulated experimental results, all AI experts should weigh in on the practical applications of the algorithm in this paper.
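The integrated absolute error mentioned above is the L1 distance between the estimated and true densities, IAE = ∫|f̂(x) − f(x)| dx. A minimal Monte Carlo sketch of this metric follows, assuming a known true density and a crude 1-D histogram estimate built with NumPy; the function name, bin count, and sampling scheme are illustrative choices, not the authors' setup.

```python
import numpy as np

def integrated_absolute_error(f_hat, f_true, lo, hi, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the L1 distance  IAE = integral |f_hat - f_true|
    over [lo, hi], using uniform sampling of the interval."""
    rng = np.random.default_rng(seed)
    xs = rng.uniform(lo, hi, size=n_samples)
    return (hi - lo) * np.mean(np.abs(f_hat(xs) - f_true(xs)))

# Gauge a 32-bin histogram estimate against the true uniform density on [0, 1].
data = np.random.default_rng(2).uniform(0.0, 1.0, 10_000)
counts, edges = np.histogram(data, bins=32, range=(0.0, 1.0), density=True)
f_hat = lambda x: counts[np.clip(np.searchsorted(edges, x) - 1, 0, 31)]
f_true = lambda x: np.ones_like(x)
iae = integrated_absolute_error(f_hat, f_true, 0.0, 1.0)
```

A good fit drives the IAE toward zero, which is what makes it a natural yardstick for comparing density estimators on simulated data where the true density is known.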

Reviewer: Amos Olagunju Review #: CR141199 (1308-0721)
[1] Schalkoff, R. J. Pattern recognition: statistical, structural, and neural approaches. Wiley, New York, NY, 1992.
Categories: Trees (E.1); Statistical Computing (G.3); Trees (G.2.2); Graph Theory (G.2.2); Probability And Statistics (G.3)
Other reviews under "Trees":
The hB-tree: a multiattribute indexing method with good guaranteed performance. Lomet D., Salzberg B. ACM Transactions on Database Systems 15(3): 625-658, 1990. Type: Article. Reviewed: Jun 1, 1991.
Multidimensional trees. Baldwin W., Strawn G. Theoretical Computer Science 84(2): 293-311, 1991. Type: Article. Reviewed: Oct 1, 1992.
Hash trees versus b-trees. Bell D., Deen S. The Computer Journal 27(3): 218-224, 1984. Type: Article. Reviewed: Feb 1, 1985.

Reproduction in whole or in part without permission is prohibited.   Copyright 1999-2024 ThinkLoud®