Course number 59415

Fall 2001

Meeting: MWF, 9:00-9:50 am.

Bus. Adm./Math Building #250

San Diego State University

Other times: by appointment or good fortune (I will normally be in my office and available).

David Applebaum,

Other Sources:

Chung, Kai Lai,

Grimmett, G. R. and Stirzaker, D. R.

Mansuripur, Masud,

Hankerson, D., Harris, G., and Johnson, P.,

First Assignment: due Fri. Sept. 14.

Second Assignment: due Wed. Sept. 26

Third Assignment (ps file), (pdf file): due Wed. Oct. 3.

Fourth Assignment: due Fri. Oct. 12.

Fifth Assignment: due Mon. Oct. 19.

Sixth Assignment: due Mon. Oct. 29.

Seventh Assignment (ps file), (pdf file): due Wed. Nov. 21.

Eighth Assignment (ps file), (pdf file): due Wed. Nov. 28.

Ninth Assignment (ps file), (pdf file): due Fri. Dec. 14.

What is Information Theory?

- Given a "stream of data," design a method to store it compactly.
- Suppose that the stream of data is subject to occasional errors in transmission. Design a coding system to correct the errors.
- Suppose an eavesdropper is able to copy the transmitted data. Design an encrypting system to make the data unreadable to the eavesdropper.
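To make the first question concrete, here is a minimal sketch of one naive compression method, run-length encoding, which stores a string as (symbol, count) pairs. It is offered only as an illustration of "storing data compactly"; it is not one of the algorithms covered in the course, and it only helps when the data contains long runs of repeated symbols.

```python
def rle_encode(data):
    """Run-length encode a string as a list of (symbol, count) pairs."""
    runs = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            # Extend the current run.
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            # Start a new run.
            runs.append((ch, 1))
    return runs

def rle_decode(runs):
    """Invert rle_encode: expand each (symbol, count) pair."""
    return "".join(ch * n for ch, n in runs)

encoded = rle_encode("aaaabbbcc")
print(encoded)  # [('a', 4), ('b', 3), ('c', 2)]
assert rle_decode(encoded) == "aaaabbbcc"
```

Note that on data without repeated runs, the "compressed" form is longer than the original; part of what Shannon's theory explains is which sources can be compressed at all, and by how much.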

I have presented these questions as practical ones by saying "Design a method." Shannon was mainly concerned with the more fundamental question "What is the best that we could possibly do?" For example:

- Suppose we would like to store novels---written in English---on a computer. What is the best compression ratio we might hope to achieve?
- Suppose we transmit a sequence of bits (each bit is 0 or 1) to a satellite, and on average one in every 100 bits is incorrectly received. What is the best possible coding scheme to reduce the error rate to an acceptable level?
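The satellite example can be made concrete with the simplest possible coding scheme, a 3-fold repetition code: send each bit three times and decode by majority vote. The sketch below (not a scheme discussed in the course text; just an illustrative baseline) models the channel as flipping each transmitted bit independently with probability p = 0.01, and computes the residual error rate, which is the probability that at least two of the three copies are flipped.

```python
import random

def bsc(bit, p, rng):
    """Binary symmetric channel: flip the bit with probability p."""
    return bit ^ (rng.random() < p)

def repeat_and_vote(bits, p, n=3, rng=None):
    """Send each bit n times through the channel; decode by majority vote."""
    rng = rng or random.Random(0)
    decoded = []
    for b in bits:
        votes = sum(bsc(b, p, rng) for _ in range(n))
        decoded.append(1 if votes > n // 2 else 0)
    return decoded

# Analytic residual error rate for n = 3:
# exactly two copies flipped, or all three.
p = 0.01
residual = 3 * p**2 * (1 - p) + p**3
print(residual)  # 0.000298: a 1-in-100 channel becomes about 3-in-10000
```

The catch is the cost: repetition triples the transmission time. The Noisy Coding Theorem says how much error reduction is achievable at a given transmission rate, which no amount of naive repetition reveals.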

In this course we deal primarily with the first question above, which is the easiest to answer. We will prove Shannon's Noiseless Coding Theorem and other fundamental results about data compression. We will also look at several compression algorithms. The climax of the course is the rather difficult Noisy Coding Theorem, which answers the second question above in a theoretical fashion. We will not discuss algorithms for error correction, which is the subject of Math 525: Coding Theory. We will say a little about the theoretical answer to the third question. We will not discuss techniques for information security, which is touched on in Math 522: Number Theory, and covered extensively in Math 626: Cryptography.
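The quantity at the heart of the Noiseless Coding Theorem is the Shannon entropy H = -Σ pᵢ log₂ pᵢ, the theoretical limit (in bits per symbol, on average) for compressing a source with symbol probabilities pᵢ. A few lines of Python make the definition tangible:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin needs one full bit per toss
print(entropy([0.9, 0.1]))  # ~0.469: a biased coin is highly compressible
print(entropy([1.0]))       # 0.0: a certain outcome carries no information
```

The biased coin shows why compression is possible at all: even though each toss is one of two outcomes, long sequences of tosses can be stored in well under one bit per toss.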

Information theory is intimately related to probability theory, so we will develop the basics of discrete probability theory: counting principles, joint and conditional distributions, Bayes' theorem, expected value of a random variable. We will also develop the main theorems of probability theory, the Law of Large Numbers and the Central Limit Theorem, since they are required to establish the Noisy Coding Theorem.
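As a small preview of the probability material, the Law of Large Numbers can be watched in action with a simulation (an informal illustration, not a course assignment): the average of many fair-die rolls settles near the expected value E[X] = 3.5.

```python
import random

def sample_mean(n, rng):
    """Average of n rolls of a fair six-sided die."""
    return sum(rng.randint(1, 6) for _ in range(n)) / n

rng = random.Random(42)
# E[X] = (1+2+3+4+5+6)/6 = 3.5; sample means concentrate around it
# as n grows -- this is the Law of Large Numbers.
for n in (10, 1000, 100000):
    print(n, sample_mean(n, rng))
```

It is exactly this concentration phenomenon, applied to long blocks of source symbols, that drives the proof of the Noisy Coding Theorem.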

We will have regular assignments and a final exam. For the assignments, there will be a small number of problems (10 or so) which you should write up carefully. There will also be opportunities to program data compression algorithms for credit.

Weekly work | 650
Final       | 350
Total       | 1000