Computing Reviews
Input-dependent neural network trained by real-coded genetic algorithm and its industrial applications
Ling S., Leung F., Lam H. Soft Computing 11(11): 1033-1052, 2007. Type: Article
Date Reviewed: Jan 11 2008

Neural networks (NNs) map inputs to outputs using neuron elements inspired by the human nervous system. Conventionally, a network's parameters are fixed after training, independent of the input data.

Have you ever thought of an NN whose parameters adapt to changes in the input environment? This paper presents an input-dependent neural network (IDNN) whose parameters vary with the input data set. When the input data cover a large domain, the IDNN behaves as if individual NNs each handled a subdomain of the input. The paper gives a good example that shows the advantages of the IDNN. The IDNN has two units: a network memory (NM) and a data-processing (DP) neural network. The NM stores the parameters that govern how the DP neural network handles the input data, allowing the network to capture the characteristics of the input environment (which traditional NNs cannot).
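The NM/DP split described above can be illustrated with a small sketch. This is not the authors' exact formulation — the subdomain rule and layer shapes here are invented for illustration — but it shows the key idea: the NM selects a parameter set according to which input subdomain a sample falls in, and the DP network then computes the output with those parameters, so different inputs are effectively served by different individual networks.

```python
import numpy as np

rng = np.random.default_rng(0)

class IDNNSketch:
    """Illustrative input-dependent network: a network memory (NM)
    holding one parameter set per input subdomain, plus a
    data-processing (DP) network that uses the selected set."""

    def __init__(self, n_in, n_hidden, n_out, n_regions):
        # NM: one (hidden, output) weight pair per input subdomain.
        self.memory = [
            (rng.standard_normal((n_hidden, n_in)),
             rng.standard_normal((n_out, n_hidden)))
            for _ in range(n_regions)
        ]
        self.n_regions = n_regions

    def _region(self, x):
        # Toy subdomain rule (an assumption, not from the paper):
        # bucket inputs by their mean value.
        idx = int((np.mean(x) + 1) / 2 * self.n_regions)
        return max(0, min(idx, self.n_regions - 1))

    def forward(self, x):
        w1, w2 = self.memory[self._region(x)]  # NM supplies parameters
        h = np.tanh(w1 @ x)                    # DP network processes input
        return w2 @ h

net = IDNNSketch(n_in=3, n_hidden=5, n_out=2, n_regions=4)
y = net.forward(np.array([0.2, -0.1, 0.5]))
print(y.shape)  # (2,)
```

Each subdomain thus gets its own effective network without training several separate models by hand.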

The IDNN exhibits better learning and generalization ability than traditional NNs: it operates as if different individual NNs handled different input data sets. The authors also propose an improved training method, a real-coded genetic algorithm (RCGA), to estimate the parameters of the network. For applications involving large domains of input-output mappings, the IDNN performs the mapping task more accurately. Experimental results show that the IDNN achieves better results than conventional feed-forward fully connected neural networks (FFCNNs) and wavelet neural networks (WNNs) on short-term load forecasting and handwritten graffiti recognition.
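To make the RCGA idea concrete, here is a minimal sketch of a real-coded genetic algorithm tuning the weights of a tiny network on toy data. The selection, crossover, and mutation operators below are generic textbook choices, not necessarily the specific operators the authors propose; the toy target function and all parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(w, X, y):
    # Higher is better: negated mean-squared error of tanh(X @ w).
    pred = np.tanh(X @ w)
    return -np.mean((pred - y) ** 2)

# Toy regression problem with a known weight vector.
X = rng.standard_normal((50, 4))
true_w = np.array([0.5, -1.0, 0.3, 0.8])
y = np.tanh(X @ true_w)

pop = rng.uniform(-2, 2, size=(30, 4))  # real-coded chromosomes
for gen in range(200):
    scores = np.array([fitness(w, X, y) for w in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]  # elitist truncation
    children = []
    for _ in range(20):
        i, j = rng.integers(10, size=2)
        alpha = rng.uniform(0, 1)
        child = alpha * parents[i] + (1 - alpha) * parents[j]  # arithmetic crossover
        child = child + rng.normal(0, 0.1, size=4)             # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best_mse = -max(fitness(w, X, y) for w in pop)
print(best_mse < 0.1)
```

Because chromosomes are vectors of real numbers rather than bit strings, crossover and mutation act directly in weight space, which is what makes real coding attractive for NN parameter estimation.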

This paper is recommended to readers interested in NNs that perform mapping tasks involving large numbers of parameters more efficiently and accurately. It does, however, require a good background in NNs and parameter estimation methods.

I noticed a couple of small mistakes that I would like to point out. The reference formats are inconsistent: for example, on page 1037, the RCGA is cited as both (Ling and Leung 2007) and Ling and Leung [2007]. Also, on page 1041, in Figure 8, the name of the proposed method should be IDNN instead of VPNN.

Reviewer: Jie Tang | Review #: CR135096 (0811-1119)
 
Pattern Analysis (I.5.2)
Heuristic Methods (I.2.8)
Neural Nets (I.5.1)
Text Processing (I.5.4)
Applications (I.5.4)
Models (I.5.1)
Other reviews under "Pattern Analysis":

Understanding data pattern processing
Inmon W., Osterfelt S., QED Information Sciences, Inc., Wellesley, MA, 1991. Type: Book (9780894353864)
Jun 1 1992

Parallel thinning with two-subiteration algorithms
Guo Z., Hall R. Communications of the ACM 32(3): 359-373, 1989. Type: Article
Jan 1 1990

A variable window approach to early vision
Boykov Y., Veksler O., Zabith R. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(12): 1283-1294, 1998. Type: Article
Oct 1 1999
