The following computations are set forth because of their relevance to recent advanced multi-antenna techniques [1]. The presentation follows that given by Cover and Thomas [2]. Details on multivariate distributions can be found in Hogg and Craig [3] and in the detailed computations of [4, 5]. The joint differential entropy of random variables $X_1, \dots, X_n$ with joint p.d.f. $f(\mathbf{x})$ satisfies

$$ h(X_1, \dots, X_n) = -\int f(\mathbf{x}) \ln f(\mathbf{x}) \, d\mathbf{x} = -E\left[\ln f(\mathbf{X})\right]. $$

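As a one-dimensional sanity check (an illustrative sketch, assuming NumPy and SciPy are available), this definition can be evaluated numerically for a scalar normal $N(0, \sigma^2)$, where it should reproduce the known closed form $\tfrac{1}{2}\ln(2\pi e \sigma^2)$:

```python
# Numerically evaluate h = -int f ln f dx for a scalar normal N(0, sigma^2)
# and compare with the closed form (1/2) ln(2*pi*e*sigma^2).
import numpy as np
from scipy.integrate import quad

sigma = 2.0

def f(x):
    # Scalar normal p.d.f. with mean 0 and variance sigma^2
    return np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Finite limits wide enough that the omitted tails are negligible
h_numeric, _ = quad(lambda x: -f(x) * np.log(f(x)), -50, 50)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
assert np.isclose(h_numeric, h_closed)
```

The finite integration limits avoid evaluating $\ln f$ where the density underflows to zero.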
The p.d.f. (probability density function) of jointly normal real random variables is given by

$$ f(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} |K|^{1/2}} \exp\!\left( -\tfrac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^T K^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right), $$

where $\mathbf{x}$ and $\boldsymbol{\mu}$ are length-$n$ column vectors, $K$ is the $n \times n$ covariance matrix, and $|K|$ is its determinant. In this case,

$$ h(\mathbf{X}) = E\!\left[ \tfrac{1}{2} (\mathbf{X} - \boldsymbol{\mu})^T K^{-1} (\mathbf{X} - \boldsymbol{\mu}) \right] + \tfrac{1}{2} \ln\!\left( (2\pi)^n |K| \right). $$

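The p.d.f. formula above can be checked numerically against a reference implementation (an illustrative sketch, assuming NumPy and SciPy are available; the covariance here is an arbitrary positive-definite example):

```python
# Check the explicit jointly normal p.d.f. formula against SciPy's
# reference implementation at a random evaluation point.
import numpy as np
from scipy.stats import multivariate_normal

n = 3
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)      # a symmetric positive-definite covariance
mu = rng.standard_normal(n)
x = rng.standard_normal(n)

# Explicit formula: (2*pi)^(-n/2) |K|^(-1/2) exp(-0.5 (x-mu)^T K^{-1} (x-mu))
d = x - mu
quad_form = d @ np.linalg.solve(K, d)
f = np.exp(-0.5 * quad_form) / np.sqrt((2 * np.pi) ** n * np.linalg.det(K))

# Compare with SciPy
f_ref = multivariate_normal(mean=mu, cov=K).pdf(x)
assert np.isclose(f, f_ref)
```

Solving $K^{-1}(\mathbf{x}-\boldsymbol{\mu})$ via `np.linalg.solve` avoids forming the explicit inverse.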
Since the expectation of a sum is the sum of the expectations,

$$ E\!\left[ (\mathbf{X} - \boldsymbol{\mu})^T K^{-1} (\mathbf{X} - \boldsymbol{\mu}) \right] = E\!\left[ \sum_{i,j} (X_i - \mu_i) (K^{-1})_{ij} (X_j - \mu_j) \right], $$

or

$$ E\!\left[ (\mathbf{X} - \boldsymbol{\mu})^T K^{-1} (\mathbf{X} - \boldsymbol{\mu}) \right] = \sum_{i,j} (K^{-1})_{ij} \, E\!\left[ (X_i - \mu_i)(X_j - \mu_j) \right]. $$

Since

$$ E\!\left[ (X_i - \mu_i)(X_j - \mu_j) \right] = K_{ij}, $$

this becomes

$$ E\!\left[ (\mathbf{X} - \boldsymbol{\mu})^T K^{-1} (\mathbf{X} - \boldsymbol{\mu}) \right] = \sum_{i,j} (K^{-1})_{ij} K_{ij}. $$

But the covariance matrix and its inverse are necessarily real symmetric matrices, so $K_{ij} = K_{ji}$ and

$$ \sum_{i,j} (K^{-1})_{ij} K_{ij} = \sum_{i,j} (K^{-1})_{ij} K_{ji} = \sum_i (K^{-1} K)_{ii} = \operatorname{tr}(I_n) = n. $$

Substituting into the expression for $h(\mathbf{X})$,

$$ h(\mathbf{X}) = \tfrac{n}{2} + \tfrac{1}{2} \ln\!\left( (2\pi)^n |K| \right) = \tfrac{1}{2} \ln\!\left( (2\pi e)^n |K| \right), $$

the desired result. Additional related derivations are provided in [4, 5].
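The closed form can also be verified end to end (a numerical sketch, assuming NumPy and SciPy are available) by comparing it with a Monte Carlo estimate of $-E[\ln f(\mathbf{X})]$ over samples drawn from the distribution:

```python
# Compare the closed form (1/2) ln((2*pi*e)^n |K|) with a Monte Carlo
# estimate of h = -E[ln f(X)] for samples X ~ N(mu, K).
import numpy as np
from scipy.stats import multivariate_normal

n = 3
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)      # a symmetric positive-definite covariance
mu = np.zeros(n)

dist = multivariate_normal(mean=mu, cov=K)
samples = dist.rvs(size=200_000, random_state=rng)
h_mc = -np.mean(dist.logpdf(samples))     # Monte Carlo estimate of -E[ln f(X)]
h_closed = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(K))
assert abs(h_mc - h_closed) < 0.05        # agreement within sampling error
```

Both quantities are in nats, matching the natural logarithm used in the derivation.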

**References**

[1] M. Debbah in A. Sibille, C. Oestges, A. Zanella, (editors) *MIMO From Theory to Implementation*, Academic Press, Amsterdam (2011), Chap. 1.

[2] T. M. Cover and J. A. Thomas, *Elements of Information Theory*, Wiley, N.Y. (1991), p. 230.

[3] R. V. Hogg and A.T. Craig, *Introduction to Mathematical Statistics*, Macmillan, N.Y. (1978), Chap. 12.

[4] H. L. Rappaport, *Normal and Bivariate Normal Distributions and Moment-Generating Functions*, 7G Communications, 7GCTN03, September (2014).

[5] H. L. Rappaport, *Multivariate Distributions and Associated Differential Entropy of Jointly Normal Random Variables*, 7G Communications, 7GCTN04, October (2014).



This entry was posted on October 6, 2014 at 4:18 pm and is filed under Information Theory.