One way to understand the source coding theorem [1, 2] is to first consider a general discrete source with an output alphabet of $M$ symbols. It would require $\log_2 M$ bits to transmit a given symbol. Similarly, for a block of $N$ sequentially ordered elements of the alphabet there would be $M^N$ possible messages that can be transmitted in one block. It would thus require $N \log_2 M$ bits to send a message of $N$ symbols.

Next specialize to the case of a discrete memoryless source, also with an output alphabet of $M$ symbols. Let the $i$th member of the alphabet be denoted $s_i$ and let the probability of occurrence of $s_i$ be denoted $p_i$. Now a block of $N$ symbols, where $N$ is very large, is considered. Let the expected number of occurrences of $s_i$ be denoted $N_i$. We have that for very large $N$

$$N_i \approx N p_i .$$

That is to say, for very large $N$ the number of occurrences of each of the $s_i$ is known, and the number of *likely* messages is not $M^N$ but instead the number of distinct permutations of a block of $N$ symbols with $N_i$ occurrences of the symbol $s_i$.

So the number of possible messages $W$ that can be sent in one block of $N$ symbols is

$$W = \frac{N!}{N_1! \, N_2! \cdots N_M!} = \frac{N!}{\prod_{i=1}^{M} N_i!} .$$

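As a small numeric sketch (my addition, not from the original post): assuming a binary alphabet ($M = 2$) with $p = (0.9,\, 0.1)$ and a block of $N = 20$ symbols, the number of likely blocks is far smaller than the number of possible blocks:

```python
from math import factorial, prod

# Alphabet of M = 2 symbols, block length N = 20.
# With p = (0.9, 0.1) the expected counts are N_1 = 18, N_2 = 2.
N = 20
counts = [18, 2]

# W = N! / (N_1! N_2! ... N_M!) -- the number of distinct permutations
# of a block with these symbol counts
W = factorial(N) // prod(factorial(n) for n in counts)

print(W)       # 190 "likely" blocks
print(2 ** N)  # versus 1048576 possible blocks of 20 binary symbols
```

Only 190 of the 1,048,576 conceivable blocks have the typical symbol counts, which is why far fewer than $N \log_2 M$ bits suffice.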
Therefore, the number of bits $L$ required to send these messages or, equivalently, the number of bits needed to indicate to a receiver which of these $W$ messages was transmitted, would be

$$L = \log_2 W$$

and

$$L = \log_2 N! - \sum_{i=1}^{M} \log_2 N_i! .$$

Using the approximation $\ln n! \approx n \ln n - n$ [3], and noting that $\sum_{i=1}^{M} N_i = N$ and $N_i = N p_i$, shows

$$L \approx N \log_2 N - \sum_{i=1}^{M} N_i \log_2 N_i = -N \sum_{i=1}^{M} p_i \log_2 p_i .$$

Or

$$L \approx N H ,$$

where $H = -\sum_{i=1}^{M} p_i \log_2 p_i$ is the entropy of the alphabet.
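As a quick numerical check (my addition, assuming a binary alphabet with $p = (0.9,\, 0.1)$), the exact bit count $\log_2 W$ per symbol does approach the entropy $H$ as $N$ grows:

```python
from math import factorial, log2, prod

p = [0.9, 0.1]
H = -sum(pi * log2(pi) for pi in p)  # entropy of the alphabet, ~0.469 bits

for N in (10, 100, 1000):
    counts = [round(N * pi) for pi in p]           # N_i = N * p_i
    W = factorial(N) // prod(factorial(n) for n in counts)
    L = log2(W)                                    # exact bits to index W messages
    print(N, L / N, H)                             # L/N approaches H from below
```

By $N = 1000$ the average $L/N$ is already within about one percent of $H \approx 0.469$ bits per symbol.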

Of course *more* bits can always be used to code all of these different messages, but this represents the minimum needed when $N$ is very large. The minimum required average number of bits per transmitted symbol is denoted $\bar{L}$ and thus satisfies

$$\bar{L} \geq H .$$

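For a concrete illustration of the bound (my example, not from the post): when the symbol probabilities are dyadic, a simple prefix code attains $\bar{L} = H$ exactly. Taking $p = (1/2,\, 1/4,\, 1/8,\, 1/8)$ with codewords of lengths 1, 2, 3, 3 (e.g. `0`, `10`, `110`, `111`):

```python
from math import log2

p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # codeword lengths of a prefix code: 0, 10, 110, 111

H = -sum(pi * log2(pi) for pi in p)                 # entropy: 1.75 bits/symbol
L_bar = sum(pi * li for pi, li in zip(p, lengths))  # average length: 1.75 bits/symbol

print(H, L_bar)  # L_bar >= H holds, here with equality
```

Here each codeword length equals $-\log_2 p_i$, so the average length meets the entropy bound with equality; for non-dyadic probabilities the inequality is strict for symbol-by-symbol codes.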
This result is the source coding theorem [1, 2]. Of course, we have not shown that uniquely decodable codes exist that operate on one symbol at a time and can attain this limit. However, we have shown that this limit exists.

I have provided a more detailed write-up introducing information theory in [4].

**References**

[1] S. Haykin, *Digital Communications*, Wiley, N.Y. (1988), Chapter 2.

[2] C.E. Shannon, *A Mathematical Theory of Communication*, Bell System Technical Journal (1948).

[3] M. Abramowitz and I. Stegun, *Handbook of Mathematical Functions*, Dover, N.Y. (1965), pp. 255 – 257.

[4] H.L. Rappaport, *Notes on Information Theory I and the Source Coding Theorem*, 7G Communications, 7GCTN01 (2014).


This entry was posted on October 3, 2014 at 2:42 pm and is filed under Information Theory.
