Math 696: Special Topics
Course number 59415
Meeting: MWF, 9:00-9:50 am.
Bus. Adm./Math Building #250
San Diego State University
Professor: Mike O'Sullivan
Office: Bus. Adm./Math Building #217, ext. 594-6697
Office Hours: MWF 10:00-10:30 am, 12:00-2:00 pm.
Other times: by appointment or good fortune (I will normally be in my office).
Texts:
Applebaum, David, Probability and Information: An Integrated Approach, Cambridge University Press, 1996.
Chung, Kai Lai, Elementary Probability Theory with Stochastic Processes, Springer.
Grimmett, G. R. and Stirzaker, D. R., Probability and Random Processes, Oxford Science Publications, 2nd ed., 1992.
Mansuripur, Masud, Introduction to Information Theory, Prentice Hall, 1987.
Hankerson, D., Harris, G., and Johnson, P., Introduction to Information Theory and Data Compression, CRC Press, 1998.
Schedule and Assignments
First Assignment: due Fri. Sept. 14.
Second Assignment: due Wed. Sept. 26
Third Assignment (ps file): due Wed. Oct. 3.
Fourth Assignment: due Fri. Oct. 12.
Fifth Assignment: due Mon. Oct. 19.
Sixth Assignment: due Mon. Oct. 29.
Seventh Assignment (ps file): due Wed. Nov. 21.
Eighth Assignment (ps file): due Wed. Nov. 28.
Ninth Assignment (ps file): due Fri. Dec. 14.
What is information? It may surprise you that there is a
mathematically precise answer to this question. Claude Shannon, an
engineer and mathematician at Bell Labs, defined information and then
posed three fundamental problems.
- Given a ``stream of data,'' design a method to store it compactly.
- Suppose that the stream of data is subject to occasional
errors in transmission. Design a coding system to correct the errors.
- Suppose an eavesdropper is able to copy the transmitted data.
Design an encrypting system to make the data unreadable to the eavesdropper.
I have presented these questions as practical ones by saying
``Design a method.'' Shannon was mainly concerned with the more fundamental
question ``What is the best that we could possibly do?'' For example:
- Suppose we would like to store novels---written in English---on
a computer. What is the best compression ratio we might hope to achieve?
- Suppose we transmit a sequence of bits (each bit is 0 or 1) to a
satellite, and on average one in every 100 bits is incorrectly
received. What is the best possible coding scheme to reduce the error
rate to an acceptable level?
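Both examples can be made concrete with a little Python. The sketch below (my own illustration, not part of the course materials) computes the first-order entropy of a sample text, which bounds how few bits per character a symbol-by-symbol compressor can use, and evaluates Shannon's capacity formula 1 - H(p) for a channel that flips each bit with probability p = 0.01. The sample string and function names are chosen for illustration.

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# First-order entropy of a sample text: a rough lower bound on the
# bits per character achievable by a symbol-by-symbol compressor.
text = "it was the best of times it was the worst of times"
counts = Counter(text)
probs = [c / len(text) for c in counts.values()]
h = entropy(probs)
print(f"about {h:.2f} bits/char vs 8 bits/char uncompressed")

# Binary symmetric channel with crossover probability 0.01:
# Shannon's capacity is 1 - H(0.01) bits per transmitted bit.
p = 0.01
capacity = 1 - entropy([p, 1 - p])
print(f"capacity: {capacity:.3f} bits per bit")
```

Capacity close to 1 here says that, in principle, reliable communication over this noisy channel costs only a small fraction of the raw bit rate.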
In this course we deal primarily with the first question above, which
is the easiest to answer. We will prove Shannon's Noiseless Coding
Theorem and other fundamental results about data compression. We
will also look at several compression algorithms. The climax of the
course is the rather difficult Noisy Coding Theorem which answers
the second question above in a theoretical fashion. We will not
discuss algorithms for error correction, which is the subject of
Math 525: Coding Theory. We will say a little about the theoretical
answer to the third question. We will not discuss techniques for
information security, a topic touched on in Math 522: Number Theory
and covered extensively in Math 626: Cryptography.
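To give a flavor of the compression algorithms mentioned above (the syllabus does not name specific ones, so this is only one plausible example), here is a minimal Huffman coding sketch in Python. It builds a prefix code from symbol frequencies by repeatedly merging the two least frequent subtrees; the function name and sample string are my own.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix code for text; returns {symbol: bitstring}."""
    # Heap entries: (frequency, tiebreak id, partial code table).
    # The tiebreak id keeps equal frequencies from comparing dicts.
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)   # least frequent subtree
        n2, _, c2 = heapq.heappop(heap)   # second least frequent
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (n1 + n2, next_id, merged))
        next_id += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
print(codes)
print(f"{len(encoded)} bits vs {8 * len(text)} bits uncompressed")
```

Frequent symbols get short codewords; because no codeword is a prefix of another, the bitstream can be decoded unambiguously.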
Information theory is intimately related to probability theory, so we
will develop the basics of discrete probability theory: counting
principles, joint and conditional distributions, Bayes' theorem, and
the expected value of a random variable. We will also develop the main
theorems of probability theory, the Law of Large Numbers and
the Central Limit Theorem, since they are required to establish
the Noisy Coding Theorem.
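Both theorems are easy to see in a simulation (a quick illustration of my own, not course material): the running mean of fair-coin flips settles near 1/2, and sums of many flips, once centered and scaled by their standard deviation, spread out like a standard normal.

```python
import random
import statistics

random.seed(0)

# Law of Large Numbers: the sample mean of fair-coin flips
# approaches the expected value 1/2 as the sample grows.
flips = [random.randint(0, 1) for _ in range(100_000)]
print(sum(flips[:100]) / 100, sum(flips) / len(flips))

# Central Limit Theorem: sums of 1000 flips have mean 500 and
# standard deviation 0.5 * sqrt(1000); after centering and scaling,
# their spread is close to that of a standard normal (std dev near 1).
sums = [sum(random.randint(0, 1) for _ in range(1000)) for _ in range(500)]
z = [(s - 500) / (0.5 * 1000 ** 0.5) for s in sums]
print(round(statistics.pstdev(z), 2))
```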
We will have regular assignments and a final exam.
For the assignments, there will be a small number of problems
(10 or so) which you should write up carefully. There will also be
opportunities to program data compression algorithms for credit.