The period immediately following World War II was incredibly ripe for technology. The wartime impetus behind radar led to microwave radio transmission. The need for secrecy gave rise to a digital revolution in communications that continues today. Information theory, the digital computer, and the transistor were all invented in these few short years. If the theory of punctuated equilibrium--that is, evolution in stepwise bursts--can be applied to the history of technology, this fertile period would appear to be the perfect example of such an evolutionary burst.
Shannon himself was able to indulge his growing interest in deriving a theoretical formulation for information through his wartime work in cryptography. As a boy he had been captivated by Edgar Allan Poe's "The Gold Bug," a story that has fascinated the youth of several generations. In his college years at Michigan and MIT, Shannon had read a 1928 paper by R. V. L. Hartley (the same Hartley, incidentally, who invented the best-known vacuum tube oscillator) dealing with the transmission of information, which Shannon later said had been an important influence on his life. At Bell Labs in the years between 1940 and 1945 Shannon began where Hartley had left off, working on information and communications and using the application to cryptography as a way of legitimizing his work. The first mention of the phrase "information theory" occurs in a 1945 memorandum entitled "A Mathematical Theory of Cryptography." Curiously, the phrase is never used in his famous 1948 paper, which became the cornerstone of the new field of information theory.
Shannon's information theory is a philosophy of information from the point of view of communications. It is seldom prescriptive. It gives us a mathematical measure of information and of the informational capacity of a communications channel. Its central result is a theorem about the transmission of information over a communications channel, a result that has served as an inspiration to communications designers now for almost half a century. Shannon's genius lay in exposing a new way of thinking about information and communication. He pointed in a direction and set out the bounds within which one must stay. By now the road has been well traveled. The Transactions on Information Theory, the principal journal in the field, is filled with mathematical, abstract papers on esoteric problems with long titles--indicative of the degree of specialization that has been reached.
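The measure and the theorem mentioned above can be stated compactly. The expressions below are the standard forms descending from Shannon's 1948 paper, written here with base-2 logarithms so that the answers come out in bits; they are offered only as a reference point for the discussion that follows, not as a summary of his derivation.

```latex
% Entropy of a source that emits symbol i with probability p_i,
% measured in bits per symbol:
H = -\sum_{i} p_i \log_2 p_i

% Capacity of a noisy channel with input X and output Y: the largest
% mutual information attainable over all choices of input distribution.
C = \max_{p(x)} I(X; Y)

% The channel coding theorem, the central result referred to above:
% for any rate R < C there exist codes that transmit at rate R with
% arbitrarily small probability of error; no such codes exist for R > C.
```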
Information theory is primarily concerned with the mechanics of information handling. In the introduction to Shannon's paper we find the following:
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
Thus, for the most part, information theory is not concerned with the meaning or the value of the information it describes. A bit of information could represent the chance throw of a coin--a head or a tail--or in the World War II era it could have represented the choice of location for the Allied invasion of France, whether it would be the coast of Normandy or Calais. In the eyes of information theory the same amount of information would be involved in either case; that is, the same storage capacity or the same transmission capacity would be required. Surely, you might argue, there is a great deal of difference. The chance flip of a coin might be of no consequence whatsoever, yet Hitler's foreknowledge of the location of the Allied invasion might have changed the course of the world.
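To make that indifference to meaning concrete, here is a minimal sketch of the calculation, assuming purely for illustration that the two outcomes are equally likely in both situations; the names and probabilities are mine, not part of the theory, and the point is only that the count of bits comes out the same.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the possible outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip: two outcomes, each with probability 1/2.
coin = {"heads": 0.5, "tails": 0.5}

# The invasion site, taken (for illustration only) as two equally likely coasts.
invasion = {"Normandy": 0.5, "Calais": 0.5}

print(entropy_bits(coin.values()))      # 1.0 bit
print(entropy_bits(invasion.values()))  # 1.0 bit: the same, whatever the stakes
```

One bit of storage or transmission capacity suffices in either case; nothing in the calculation knows, or cares, what hangs on the answer.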
In some respects I would agree that disregarding what Shannon called "meaning" is a philosophical deficiency of information theory. Yet I would despair of producing useful concepts in a theory that accounted for a property as difficult to measure as meaning, or, as a consequence, value. Information theory considers information in much the same sense that we might study money in terms of the size and weight of the paper on which it is printed. Using such a theory we might derive the size of truck needed to transport our currency or the vault space required for its storage, but we would not be concerned with the fluctuations of the exchange rate or the effects of inflation on the intrinsic value of our paper certificates. Nevertheless, we shall see that Shannon's theoretical viewpoint provides deep insight into the generation and interpretation of information--insight that often borders on questions of meaning and value.