Computing Reviews
A different approach to finding the capacity of a Gaussian vector channel
Tsybakov B. Problems of Information Transmission 42(3):183-196, 2006. Type: Article
Date Reviewed: Oct 15 2007

The capacity of a Gaussian multiple-input, multiple-output (MIMO) vector channel is considered in this paper. The problem of finding the capacity of a Gaussian channel has been studied before; this paper, however, follows a different approach. In earlier work, the capacity is defined operationally, as the supremum of transmission rates below which the input information can be carried through the channel with arbitrarily small error probability; once the transmission rate exceeds this bound, the error probability becomes too large for meaningful transmission. In this paper, the capacity is instead defined as the maximum of the mutual information rate between the channel input and output.
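In symbols, and with notation assumed here rather than taken from the paper (a channel model Y = HX + Z with Gaussian noise Z and an input power budget P), this information-theoretic definition of capacity reads:

\[
  C \;=\; \max_{p_X \,:\, \mathbb{E}[\|X\|^2] \le P} I(X;Y),
  \qquad Y = HX + Z, \quad Z \sim \mathcal{N}(0,\Sigma_Z).
\]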

Strictly speaking, the definition of channel capacity adopted in this paper is similar to the one adopted in another referenced paper. When the error probability is small, the channel output carries nearly all of the information in the input; hence, the mutual information between the channel input and output is close to the self-information (that is, the entropy) of the input. If X and Y denote the channel input and output, respectively, the mutual information between X and Y is written I(X;Y) and can be expressed as H(X) - H(X|Y), where H(X) is the entropy of the input X and H(X|Y) is the conditional entropy of X given the output Y. The conditional entropy H(X|Y) is small when X can be recovered from Y with small error probability, and in that case I(X;Y) is close to H(X).
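The decomposition I(X;Y) = H(X) - H(X|Y) is easy to check numerically. The short Python sketch below does so for a small, invented discrete joint distribution; it is purely illustrative and is not taken from the paper.

import numpy as np

# Toy joint pmf p(x, y); the numbers are invented for illustration only.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    # Shannon entropy in bits, ignoring zero-probability outcomes.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)                               # marginal distribution of X
p_y = p_xy.sum(axis=0)                               # marginal distribution of Y
H_X = entropy(p_x)                                   # H(X), entropy of the input
H_X_given_Y = entropy(p_xy.ravel()) - entropy(p_y)   # H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y                             # I(X;Y) = H(X) - H(X|Y)

print(f"H(X) = {H_X:.3f}, H(X|Y) = {H_X_given_Y:.3f}, I(X;Y) = {I_XY:.3f} bits")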

Even so, the approach to deriving the channel capacity adopted in this paper does suggest a new method of analysis for calculating channel capacity. Finding the channel capacity is formulated as an optimization problem under a power constraint, and the optimization is carried out using a Lagrange function. Moreover, the analysis of the vector channel's capacity is well elaborated, showing how the contribution of each individual input to the capacity is optimized. Overall, the analytical approach of this paper is helpful and can inspire other analyses.
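For a concrete sense of where a Lagrangian treatment of the power-constrained problem typically leads, the sketch below implements the standard water-filling allocation for parallel Gaussian subchannels. It is a textbook illustration under assumed notation (the function name, subchannel noise levels, and power budget are invented for this example), not a reproduction of the paper's derivation.

import numpy as np

# Standard water-filling sketch: the Lagrangian (KKT) conditions for
# maximizing sum_i 0.5*log2(1 + p_i/n_i) subject to sum_i p_i <= P give
# p_i = max(mu - n_i, 0), where mu is the "water level".
# All numbers below are illustrative, not taken from the reviewed paper.

def water_filling(noise_levels, total_power, tol=1e-9):
    n = np.asarray(noise_levels, dtype=float)
    lo, hi = n.min(), n.max() + total_power    # bracket for the water level mu
    while hi - lo > tol:                       # bisection on mu
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - n, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - n, 0.0)

noise = np.array([0.5, 1.0, 2.0, 4.0])         # effective noise per subchannel (assumed)
P = 3.0                                        # total input power budget (assumed)
powers = water_filling(noise, P)
capacity = 0.5 * np.sum(np.log2(1.0 + powers / noise))
print("powers:", np.round(powers, 3), "capacity:", round(capacity, 3), "bits/use")

Here the water level mu plays the role of the Lagrange multiplier: subchannels whose effective noise exceeds mu receive no power, while the rest are filled up to the common level.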

Reviewer: Jun Liu. Review #: CR134832 (0808-0788)
Formal Models Of Communication (E.4)
Computations On Discrete Structures (F.2.2)
Computations On Matrices (F.2.1)
Information Theory (H.1.1)
Nonnumerical Algorithms And Problems (F.2.2)
Numerical Algorithms And Problems (F.2.1)
Other reviews under "Formal Models Of Communication":

Statistical and inductive inference by minimum message length (Information Science & Statistics)
Wallace C., Springer-Verlag New York, Inc., Secaucus, NJ, 2005. 432 pp. Type: Book (9780387237954)
Date reviewed: Jan 17 2006

Stochastic recovery problem
Darkhovsky B. Problems of Information Transmission 44(4):303-314, 2008. Type: Article
Date reviewed: May 21 2009

Universal semantic communication
Juba B., Springer Publishing Company, Incorporated, New York, NY, 2011. 416 pp. Type: Book (978-3-642232-96-1)
Date reviewed: Jul 13 2012
