Computing Reviews
Deming least-squares fit to multiple hyperplanes
Moniot R. Applied Numerical Mathematics 59(1): 135-150, 2009. Type: Article
Date Reviewed: Jun 5 2009

Have you tried Deming-fit regression on experimental data yet? How about for a regression model involving multiple dependent and independent variables?

Deming regression differs from conventional regression in that the independent variables (the Xi, for example) are assumed to have variance of their own. Determining the variability of the Xi has practical implications: you may have to repeat experimental runs, where each run occurs at a unique combination of the levels of the Xi. Depending on the problem domain, such replication can be costly or even impossible.
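To make the distinction concrete, here is a minimal sketch of the classic single-variable Deming fit (not the paper's multi-hyperplane method). It uses the standard closed-form solution, where `delta` is the assumed ratio of the y-error variance to the x-error variance; `delta=1` gives orthogonal regression:

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Closed-form Deming regression for one independent variable.

    delta = var(y errors) / var(x errors); delta=1 is orthogonal
    regression.  Returns (slope, intercept).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xbar, ybar = x.mean(), y.mean()
    # Sample second moments about the means.
    sxx = np.mean((x - xbar) ** 2)
    syy = np.mean((y - ybar) ** 2)
    sxy = np.mean((x - xbar) * (y - ybar))
    # Standard closed-form Deming slope.
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    intercept = ybar - slope * xbar
    return slope, intercept
```

Unlike ordinary least squares, which minimizes only vertical residuals, this estimator minimizes residuals in both coordinates, weighted by the assumed error-variance ratio.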

The author presents the mathematics behind a method for solving a Deming-fit regression that can include multiple dependent and independent variables. The method simultaneously produces the predictive equations relating each dependent variable to the independent variable(s).

In conventional regression, goodness of fit is indicated by a statistic named R². We include or remove variables in the model to move this number toward 1.0; a good R² is close to (at most) 1.0. In this paper, the goodness-of-fit statistic for Deming-fit regression is a chi-square statistic. This statistic should also be close to 1.0, but may indicate a good fit with a value of 0.7 to 1.3. If you look at Web-based tools that perform Deming regression, you'll find that you have to provide the standard deviations of both the Y(s) and the X(s). Of course, if you're running the same set of experiments over and over again, you may indeed know these numbers. But what if you are performing the analysis for the first time? You'll have to run experiments to get these standard deviations or obtain them from some other source.
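As an illustration of this kind of goodness-of-fit check (a sketch of the usual reduced chi-square for a single-variable Deming fit, not the statistic defined in the paper), each residual can be weighted by its total projected variance, built from the user-supplied standard deviations of X and Y:

```python
import numpy as np

def reduced_chi_square(x, y, slope, intercept, sd_x, sd_y):
    """Reduced chi-square for a single-variable Deming fit.

    Each residual is weighted by its total variance
    sd_y**2 + slope**2 * sd_x**2, then the weighted sum of squares
    is divided by n - 2 degrees of freedom.  A value near 1.0
    suggests the fit is consistent with the stated error SDs.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    resid = y - (intercept + slope * x)
    var_total = sd_y ** 2 + slope ** 2 * sd_x ** 2
    return float(np.sum(resid ** 2 / var_total) / (len(x) - 2))
```

Values well above 1.0 suggest the model (or the stated standard deviations) underestimates the scatter; values well below 1.0 suggest the error estimates are too pessimistic.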

According to sources on the Web, Deming-fit regression produces a better predictive equation than conventional least squares when Y is not highly correlated with X (correlation less than 0.8). This is definitely an advantage. However, as previously mentioned, you need to know the variance of the model's variables. I'm not sure that a lack of knowledge of variance negates the benefit of Deming regression, but it might.

Reviewer:  Dick Brodine Review #: CR136912 (1002-0180)
Least Squares Methods (G.1.6)
Correlation And Regression Analysis (G.3)
Least Squares Approximation (G.1.2)
Numerical Algorithms (G.1.0)
Uncertainty, "Fuzzy," And Probabilistic Reasoning (I.2.3)
Applications (G.1.10)
Other reviews under "Least Squares Methods":
On computational aspects of bounded linear least squares problems. Dax A. ACM Transactions on Mathematical Software 17(1): 64-73, 1991. Type: Article. Reviewed: Feb 1 1992
An FPGA-based parallel architecture for on-line parameter estimation using the RLS identification algorithm. Ananthan T., Vaidyan M. Microprocessors & Microsystems 38(5): 496-508, 2014. Type: Article. Reviewed: May 19 2015

Reproduction in whole or in part without permission is prohibited. Copyright 1999-2024 ThinkLoud®