In the 16th century science progressed at the pace of the postal system. Often it would take six months for one scientist to learn of the ongoing results of another. The great Danish astronomer Tycho Brahe carefully meted out his results, and it was only after his death that Johannes Kepler was able to inherit the observational data he needed to confirm his hypotheses. Yet today astronomers post data on the web for instant worldwide access, and even routinely manipulate telescopes remotely through the Internet. Communications has always been, if not the heartbeat, the circulatory system of science. Yet even as progress in communications speeds the progress of science, there is a recursive relationship in which science improves communications. There is both a communications of science and a science of communications.
Today communication is almost synonymous with electrical means – radio and television, the telephone and Internet. Yet science itself far predates the electrical era of the last century. In fact, it can be argued that science, defined as knowledge of the world of nature, even predates the evolution of written language, the fundamental underpinning of all human communication. The builders of Stonehenge demonstrated a remarkable knowledge of engineering and astronomy around 3100 BC. This is almost the same date attributed to the beginnings of the Sumerian written language, while the first syllabic script, Linear B, dates from about 1400 BC.
Following the development of written language, the next stage of communications was the
emergence of postal systems. There
are historical references to postal systems as early as 2000 BC in ancient Egypt
and in China about 1000 BC. These
ancient systems invariably employed relays of messengers, either mounted or on
foot, and were a natural outgrowth of the need for administration of states and
kingdoms. The most celebrated
ancient postal system was the cursus
publicus, which is credited with holding together the vast Roman Empire over
a number of centuries before its disintegration.
While a postal system provides the rudiments of individual communication, it does not
enable broadcast or widespread knowledge dispersal. For that purpose, books have been a remarkable social
instrument, lasting in essential form throughout the ages from almost the origin
of written languages until today. There
are early books on clay tablets in Mesopotamia and Egyptian papyri that date
from the third millennium BC. Without
a system for access, however, the effectiveness of books would be limited, and
hence the importance of the rise of the library.
This, too, was an ancient innovation, dating back to at least the third
century BC with the founding of the most famous library of antiquity, the
Library of Alexandria.
Libraries had existed for nearly two thousand years before the invention of movable type and the
publication of Gutenberg’s Bible in 1455.
This invention may be, after the conception of written language, the most
important event in the history of communications.
The printing press made possible the widespread dissemination of books,
which had previously been the exclusive property of the rich and powerful.
The subsequent history of science is filled with books of singular fame
and influence, such as Newton’s Principia
Mathematica and Darwin’s Origin of
Species. Even in the fast and
transient world of today, science is still marked by the publication of a
succession of celebrated books.
While books, magazines, journals, and the post have made up the physical
infrastructure for communications in science over the centuries, there is, in
addition, a social infrastructure necessary for effective communications in any
community such as that in science. It
may seem remarkable in reading histories of science that the scientists working
in a particular field always seemed to know each other and to correspond
regularly, in spite of the seemingly insuperable barriers between countries and
continents in the ages before airplanes and radio. In fact, the world of a particular science has always seemed
to be virtually a small place. The
rule of six degrees of separation has been especially effective in science,
where fame and acceptance of one’s views by peers have been powerful
motivating forces. Even today the
millions of scientists worldwide naturally subdivide themselves into specialties
where everyone knows everyone else. It
is still a small world.
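The arithmetic behind the six-degrees rule can be sketched in a few lines. The acquaintance count k = 50 is purely an illustrative assumption, and real acquaintance circles overlap heavily, so this idealized model overstates the true reach; still, it shows why a handful of hops can span humanity.

```python
# An idealized sketch of why "six degrees" is plausible: if each person
# has k distinct acquaintances, roughly k**d people lie within d hops.
# k = 50 is an illustrative assumption; overlapping circles in real
# social networks make the true reach smaller.
k = 50          # assumed acquaintances per person
d = 6           # degrees of separation
reach = k ** d  # people reachable within d hops, ignoring overlap
print(f"reach within {d} hops: {reach:,}")  # 15,625,000,000
```

Even with heavy overlap discounted, the exponential growth in reach per hop is what makes the scientific world "virtually a small place."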
The social infrastructure for the communications of science was first formalized
with the founding of the Royal Society in London in 1660.
This provided a venue and process for the discussion of scientific
issues. Its publication, Philosophical
Transactions, begun in 1665, was one of the earliest periodicals in the
West. Its members, such as Isaac
Newton, Edmond Halley, Robert Hooke, and Christopher Wren, are remembered today
as the giants of scientific history. In
contrast, science today is littered with societies and journals in which most of
the names on the mastheads of publications are relatively unknown outside of
their own fields of specialization.
Communications in the Age of Electricity
The science of communications generally is understood to begin only as late as 1837 with the invention of the telegraph by Samuel Morse. As with many great inventions, the genealogy of the telegraph is complex, and the attribution to a single inventor at a particular moment in time is a great oversimplification. Curiously, Morse was a professor of painting, and today has some of his art works exhibited in the world’s leading museums. Moreover, his invention of the Morse code, a relatively efficient representation of the English alphabet, presaged an outpouring of theoretical studies of coding in the field of information theory nearly a century later.
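The efficiency of Morse code can be made concrete: frequent letters get short codes (E is a single dot) while rare letters get long ones, the same principle later formalized in variable-length source coding. The letter frequencies below are rough illustrative figures, not measurements from the source.

```python
# Morse assigns short codes to frequent letters and long codes to rare
# ones, anticipating variable-length source coding. Frequencies are
# approximate English letter frequencies, assumed for illustration.
MORSE = {
    'E': '.', 'T': '-', 'A': '.-', 'O': '---', 'I': '..', 'N': '-.',
    'S': '...', 'H': '....', 'R': '.-.', 'D': '-..', 'L': '.-..',
    'U': '..-', 'Q': '--.-', 'Z': '--..',
}
FREQ = {  # rough relative frequencies in English text (percent)
    'E': 12.7, 'T': 9.1, 'A': 8.2, 'O': 7.5, 'I': 7.0, 'N': 6.7,
    'S': 6.3, 'H': 6.1, 'R': 6.0, 'D': 4.3, 'L': 4.0,
    'U': 2.8, 'Q': 0.1, 'Z': 0.07,
}
total = sum(FREQ.values())
avg_len = sum(FREQ[c] * len(MORSE[c]) for c in MORSE) / total
fixed_len = 4  # a fixed-length code for these 14 letters needs 4 symbols
print(f"average Morse symbols per letter: {avg_len:.2f} (fixed code: {fixed_len})")
```

The weighted average comes out well under the fixed-length baseline, which is exactly the economy a telegraph operator paid by the symbol would want.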
The significance of the telegraph was that it severed the pre-existing bond between transportation and communications. Before the invention of the telegraph information could move only as fast as the physical means of transportation would allow. It took weeks before what was known in Europe became known in America. The telegraph enabled a kind of space-time convergence that brought countries and continents together, causing profound effects on economics and government policies. At one stroke the life force of science – information – was freed of its leaded feet and allowed to fly at the speed of light. Conceptually, at least, advances in communications since then have been simply to increase ease of use.
In fact, the telegraph was awkward to use. The
user experience was not much different from that of the postal service – the
need to write out carefully crafted messages and take them to the nearest
telegraph office. In contrast,
Alexander Graham Bell’s telephone brought interactive and easily accessible
communications directly to the user. At
the time of Bell’s invention in 1876 the United States was already densely
wired with telegraph lines, and the Western Union Company was one of the largest
corporations on earth.
The telegraph, of course, was undone by the telephone. Bell’s invention was basically the idea of analog – that is, the transmitted voltage should be proportional to the air pressure from speech. In that era it was the right thing to do. After all, the world we experience appears to be analog. Things like time and distance, and quantities like voltages and sound pressures seem to be continuous in value, whereas the idea of bits – ones and zeros, or dots and dashes – is an artificial creation, seemingly unrepresentative of reality.
For over a century thereafter, the wires that marched across the nation would be connected to telephones, and the transmission of the voiced information would be analog.
Progress in telephony had mainly to do with extending the coverage across the nation and
meeting the social goal of universal service.
The functionality of the telephone remained almost unchanged until the
latter part of the twentieth century.
Whether the pace of science was immediately changed by the telegraph and telephone is
difficult to determine. The
handling of the British expeditions during the solar eclipse of 1919 to measure
deflection of light at the sun’s perimeter, and thus verify Einstein’s
predictions based on his general theory of relativity, illustrates how the pace
of science continued at a postal rate, and yet could on occasion be electrified.
Although the eclipse was on May 29 of that year, Einstein had still not
heard of the results as late as September 2, at which time one of his many
letters complained of his not being informed.
Then on September 27 he received a telegram from Lorentz saying that
Eddington had found the predicted star displacement.
On November 6 there was a famous meeting of the Royal Society proclaiming
the results, and because of the telegraphed news of that meeting, Einstein awoke
in Berlin on the morning of Nov. 7 to find himself instantly famous in the
newspapers of the world.
While the telephone companies were wiring the world, there was a parallel evolution of
wireless media beginning with the dramatic experiments in radio transmission by
Guglielmo Marconi in 1894. Marconi
had made the theoretical work of James Clerk Maxwell a practical reality with
his spark transmitter and coherer receiver.
In 1901 he defied the current understanding of line-of-sight transmission
when a signal transmitted from Poldhu in Cornwall was successfully received
across the Atlantic Ocean at St. John’s, Newfoundland.
At that time there was no understanding of the importance of ionospheric
reflection in radio propagation.
Viewed in retrospect this was the kind of remarkable achievement
sometimes made by determined amateurs who refuse to accept expert opinion.
one of those curious juxtapositions of historical incidents, the sinking of the
Titanic was an event that propelled radio into the public limelight and caused
the subsequent government regulation of the radio spectrum.
The faint radio signals from the spark transmitter on the Titanic were
picked up by the young David Sarnoff, sitting in his small office above
Wanamaker’s Department Store in Manhattan on the fateful night of April 14, 1912.
Sarnoff went on to head RCA and to play a major role in the development
of both radio and television.
Television was certainly the most influential communications medium of the last century.
There came to be more television receivers in the world than telephones,
and television broadcasts were a purveyor of culture to the furthest corners of
the earth. The popularity and
politics of science were affected by this new medium.
The televised image of Neil Armstrong stepping onto the moon on July 20, 1969
was a singular event in history, when people throughout the world were unified
in their celebration of science. Moreover,
educational television made heroes of individual scientists, such as Jacob
Bronowski in The Ascent of Man, but
especially Carl Sagan in Cosmos.
Sagan became the very icon of science, and with his death in 1996 the
last popularly known scientist disappeared from public view.
Throughout the twentieth century there was a steady exponential increase in the
transmission capacity of telecommunications systems. What began as a single conversation on a pair of copper wires
ended as several hundred thousand conversations being carried by optical pulses
over glass fiber. Along the way the
copper wires were first supplanted by microwave radio relay systems, using the
technology developed for radar during World War II, incorporating the klystron
and magnetron generators first developed in England. However, the microwave radio systems were soon surpassed in
capacity by transmission over buried coaxial cables. Then about 1960 Bell Labs began the development of a
millimeter waveguide system that was projected to meet all the demands for
communications capacity well into the next millennium.
The millimeter waveguide system used a cylindrical metal pipe of about 5 cm
diameter, with the transmission mode such that the electrical field was zero at
the inside circumference of the guide. In
theory there was no loss with distance in such a mode, but that only held true
as long as there were no bends or imperfections.
The capacity of this system was considered in those days enormous, and
such capacity would be needed for what was considered to be its dominant usage
– the Picturephone, undergoing final development in the same time period of
the late 1960s. Needless to say,
neither of these major developments was ever a commercial success.
The Picturephone, introduced commercially in 1971, was a market failure.
On the other hand, the millimeter waveguide system was abandoned when it
suddenly became evident that optical fiber offered higher capacity and lower cost.
The laser had been conceived by Schawlow and Townes in 1958.
At first its use for communications had been envisioned as modulated
beams in free space, and at the time of the millimeter wave development
consideration was being given for guided optical beams using lenses within that
pipe. But Charles Kao in 1966 had
studied guided optical wave phenomena and predicted the emergence of low-loss
optical fiber, a prediction that was fulfilled by a team of materials
researchers at Corning in 1970. That
development changed the world, and by the end of the century the vast majority
of all long distance transmission was over optical fibers at data rates
approaching about a terabit per fiber.
The technological seeds of the information age were sown in a fertile period
directly after the end of the Second World War. In 1947 the transistor was invented at Bell Labs by Shockley,
Bardeen, and Brattain, shortly after the ENIAC computer had been
constructed at Penn by Mauchly and Eckert out of about 18,000 vacuum tubes.
Claude Shannon at Bell Labs was writing a landmark paper on a theory of
information, and telecommunications engineers were just beginning to consider
the possible advantages of digital transmission using pulse code modulation,
which had been thrust into the limelight by the need for encrypted transmission
during the war.
At first the transistor was seen as a direct replacement for the vacuum tube,
albeit smaller and requiring less power. But
it was the first step on a pathway that constituted possibly the most
significant technological development of the century.
In 1958 Jack Kilby at Texas Instruments fabricated a chip with several
transistors on a substrate, and the integrated circuit was born.
Perhaps the transistors themselves were less important than the ability
to fabricate through photolithography and thus mass-produce on a micro-scale the
wires and passive components that connected the active devices. Following Kilby’s work was an inevitable evolution from
calculator chip to microprocessor to personal computer.
In 1965 Gordon Moore, who later co-founded Intel, made an observation that has
since become fundamental to business planning about technology evolution.
This observation, called Moore’s Law, states that there is a factor of
two improvement in semiconductor technology every 18 months.
This steady exponential improvement in the cost effectiveness of
integrated circuits has maintained its pace almost exactly for the last three
decades. Since 1965, this represents a gain of 2^24, or more than seven
orders of magnitude. Compare this
with, for example, the difference between walking and flying in a jet plane,
which is only two orders of magnitude. This
unprecedented scaling of technology has been the irresistible force that has
driven the digital revolution.
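The compounding behind that figure is worth making explicit. A doubling every 18 months means an overall improvement of 2 raised to (years / 1.5); the sketch below simply evaluates that formula for the 1965–2001 span discussed above.

```python
# The compounding behind Moore's Law: doubling every 18 months yields
# an improvement factor of 2 ** (years / 1.5) over a span of years.
def moores_law_gain(years, doubling_period=1.5):
    """Improvement factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# 1965 to 2001 is 36 years, i.e. 24 doublings:
gain = moores_law_gain(36)
print(f"gain: {gain:,.0f}")  # 16,777,216 – more than seven orders of magnitude
```

Sixteen million-fold improvement in thirty-six years dwarfs the roughly hundred-fold difference between walking and jet flight.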
Of course, Moore’s Law is not a law of physics; it is merely an observation and
cannot be derived from physical principles.
It is remarkable that, first, it predicts exponential progress in
technology, and second, that the exponent of this progress is a constant.
The particular constant doubling period of 18 months seems balanced
precariously between stagnation and chaos, at just about an amount of controlled
obsolescence that is economically sustainable.
Gordon Moore himself has suggested that his “law” is a
self-fulfilling prophecy – that all semiconductor manufacturers, knowing the
required rate of progression, throw all necessary resources into meeting that pace.
It seems conceivable that technological progress always has been exponential, but
that Moore’s Law brought it to our attention because an exact measure of
progress – the dimensions of circuit features – became possible with the
advent of microelectronics. Progress
in a number of related technical fields is also exponential, with various
doubling periods. For example,
optical capacity doubles every 12 months, Internet traffic doubles every six
months, wireless capacity doubles every 9 months, and so forth.
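These doubling periods are easiest to compare as annual growth factors, via growth per year = 2 raised to (12 / doubling period in months). The sketch below applies that conversion to the periods quoted in the text.

```python
# Converting a doubling period into an annual growth factor:
# growth per year = 2 ** (12 / doubling_period_in_months).
def annual_factor(doubling_months):
    return 2 ** (12.0 / doubling_months)

# Doubling periods quoted in the text:
trends = {
    "semiconductors (Moore's Law)": 18,
    "optical capacity": 12,
    "wireless capacity": 9,
    "Internet traffic": 6,
}
for name, months in trends.items():
    print(f"{name}: x{annual_factor(months):.2f} per year")
```

A six-month doubling period thus means quadrupling every year, a far steeper curve than Moore's Law itself.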
The electronics revolution fostered the development of computers, the rise of
computer networks, and the digitization of information and media.
All of these were powerful themes that, woven together, created the
present digital networked economy. It
is hard to separate these themes and to say where one begins and the other
leaves off. Today’s World Wide
Web has been cited as a counterexample to the well-known thesis that all major
technological developments require 25 years for widespread availability, as was
the case with radio, television, and telephone.
The Web, by contrast, seems to have sprung up overnight.
Yet the Web needed the ubiquitous infrastructure of the Internet, which
in turn required widespread availability of computers, which required the
microprocessor, which required integrated circuits, and so forth.
Certainly, it all goes back to the transistor, though I suspect it is
possible to make an argument that takes everything back to Pliny the Elder, or
some such historical figure.
In its impact on how science is transacted, the Internet and World Wide Web have
had the greatest impact of any communications medium since possibly the printing
press. The telegraph, telephone,
and wireless were not different in concept from the postal system, except that
the modern technologies were so much faster.
The postal system, telephone, and telegraph are one-to-one topologies,
connecting a single user to another single, pre-designated user.
On the other hand, radio and television are one-to-many topologies for
the broadcast of a small number of common channels to a great many receivers.
The beauty and power of the Internet and Web is that they allow the formation of
spontaneous communities of unacquainted users.
Their topology is neither one-to-one nor one-to-many, but rather
many-to-many. They allow the
sharing of information in textual, graphic, and multimedia forms across these
communities, and they empower the users within these communities to build their
own applications. It is this
empowerment of the periphery that has opened the floodgates of innovation to
millions, whereas in all previous communications technologies the ability to
innovate and craft new systems and applications was confined to the hands of a
small number of industrial engineers who tended the centralized intelligence of
the network.
The key idea of the Internet – a simple, common core protocol with intelligence at
the periphery – was the critical ingredient of the culture from which the
Internet arose. In the 1960s
computer communications was seen as the development of modems to enable shared
access to expensive central computers over the existing telephone
infrastructure, which was circuit-switched and designed entirely around voice
transmission. Packet transmission
and the open architecture that characterized the ARPANET at its inception in
1969 had to come from outside the traditional industry.
We are fortunate today that government and academia led this development,
so that today’s Internet is not fragmented into proprietary subnets, but is
instead open to all on an equal basis. It
has been said that the greatest invention of the computer industry was not the
PC, but rather the idea of an open platform.
The Internet did the same for the telecommunications industry.
The protocol that defines the Internet, TCP/IP, was designed by Bob Kahn and Vinton
Cerf in 1973. Its genius, perhaps
understood better in retrospect, was that it perfectly obeyed the maxim of being
as simple as possible, but not more so. Consequently,
the Internet is often viewed as an hourglass, with the multitude of physical
media at the wide bottom, and the plethora of applications at the wide top.
The two are connected by the narrow neck of the Internet Protocol, IP,
through which all communications must flow.
It is a beautiful, flexible, and extensible architecture.
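The hourglass can be seen in miniature in ordinary network programming: wildly different applications all ride the same narrow waist. In this sketch two toy endpoints exchange a datagram over the loopback interface using only the common UDP/IP substrate; the payload text is, of course, invented for illustration.

```python
# The hourglass in miniature: any application, from email to the Web,
# rides the same narrow IP waist. Two toy endpoints exchange a datagram
# over the loopback interface using the common UDP/IP substrate.
import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))        # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"eclipse results confirmed", ("127.0.0.1", port))

data, _addr = receiver.recvfrom(1024)
print(data.decode())                    # the application-level payload
sender.close()
receiver.close()
```

Nothing in the network core knows or cares what the payload means; that interpretation lives entirely at the periphery, which is precisely the architectural point.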
The most important application in the early days of the Internet turned out not to
be access to time-shared computers, but rather the simple email that flowed
between network researchers. Email
today remains a mainstay, but the World Wide Web became the socio-technical
invention that facilitated the kind of communications most important to science.
It is perhaps not a coincidence that the Web came from a physics
laboratory, CERN, where it was pioneered by Tim Berners-Lee in 1989.
Subsequently, the first widely adopted graphical browser, Mosaic, was created at NCSA at the
University of Illinois. This was
followed by the commercial development of browsers and of search engines, which
had originated at Carnegie Mellon, Stanford, and Berkeley.
All of the ingredients of an information environment to promote
scientific collaboration fell into place.
The infrastructure that we have for research and collaboration in science today
through the Internet and Web is rich with power and promise.
Yet at the same time it is often filled with frustration.
Who could have dreamed a decade ago that we would have instant access to
a billion documents from the comfort and privacy of our office or laptop?
What a pleasure it is to do library research today!
However, we all frequently have the experience of starting a search for
some topic, only to get sidetracked into interesting but irrelevant material,
and never finding that which we initially sought.
It is a wonder that search engines exist at all.
What a daunting job – crawling the web daily to retrieve and inventory
those billion pages! Using
relevance criteria they typically make available about a third of those pages
for user searches, which are based on an average of only about two English key
words. It is not surprising that
the links they return often overwhelm and under-satisfy the desires of the
searcher. Some of the search
improvements being pursued today include the use of contextual information,
popularity with other users, human categorization, and interactive search refinement.
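Why a two-word query so often under-satisfies can be seen in a toy ranking sketch. The pages and query below are invented for illustration; scoring is plain term counting, the crudest possible relevance criterion.

```python
# A toy sketch of keyword search: documents ranked by how often the
# query terms occur. Pages and the ~two-word query are illustrative
# assumptions; real engines add many more relevance signals.
pages = {
    "A": "eclipse eclipse telescope data archive",
    "B": "telescope eclipse observation",
    "C": "postal history of science",
}

def score(text, query):
    """Count occurrences of each query term in the page text."""
    words = text.split()
    return sum(words.count(term) for term in query.split())

query = "eclipse telescope"
ranked = sorted(pages, key=lambda p: score(pages[p], query), reverse=True)
print(ranked)  # ['A', 'B', 'C']
```

With only two keywords, many real pages tie at the same score, which is why the contextual and popularity signals mentioned above matter so much in practice.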
The information access dream has often been stated as having the Library of Congress
on line. Unfortunately, the reality
is very different. The web pages
are full of junk and questionable material (as is the Library of Congress, for
that matter), and often do not include the detailed material necessary for
scientific research. Even though
scientists are as a group entirely comfortable with computers and networks, they
have been slow to provide on-line access to their journals.
Of course, this varies across disciplines, but one of the main obstacles
has been the existing publication system, which depends on the sale of magazines
(particularly to libraries) to support its operation, whereas in the Internet a
strong culture has evolved that believes that “information wants to be
free.” There are few instances of
web businesses that have been able to sustain subscription or per-copy fees for
on-line access to their material.
There are several other characteristics of the scientific establishment that have
hindered web access to current research results. One is the need for peer review in order to establish some
judgment on the material. In a
world increasingly filled with junk, the guidance of peers as to what is worth
spending the limited resource of personal time upon has become more critical
than ever. Even though web
publication can be nearly instantaneous, peer review still proceeds at human
pace. Another serious obstacle has
been the tenure system at universities, and their reluctance to give full credit
to on-line publication. Nonetheless,
in spite of these obstacles, there are a number of branches of science where,
through list-servs and other mechanisms, fellow researchers exchange their
latest research results nearly in real time.
Much of science is a collaborative enterprise, and the Internet provides mechanisms
to enable collaboration at a distance. Programs
that implement “whiteboards”, where participants can sketch on a shared
visual space, have been available for several years.
Remote sharing and manipulation of instruments and lab facilities, such
as telescopes and microscopes, is commonly done today.
Video conferencing over the Internet is relatively simple at low
resolution, but higher resolutions and multi-party calls become increasingly
difficult with today’s technology. There
is considerable work being done to establish architectures and protocols for
efficient video broadcast trees. Nonetheless,
it must be said that telepresence is still in many respects a poor substitute
for traditional face-to-face interaction.
In the current sociology of the net the notion of portals is popular, where there
is a single website that serves as an entrance to integrate material and links
to a particular field. In science a
good example of taking this concept to the next level is the Biology Workbench
at the NCSA. Web visitors to the
Biology Workbench can access a number of worldwide biology databases, and can
choose among the collected application programs to perform operations on these
databases. Users can view their
results in customized ways, and are unaware of where the applications are
actually running.
The net is also increasingly important for continuing education, the lifeblood of
the scientific profession. A number
of courses are being put on the web, and both universities and commercial
enterprises are getting in the business of packaging educational material for
distance learning. The advantages
are ease of access and the ability to search and speed through material for the
most efficient use of time. Typically,
there would be a “talking head” (which enhances involvement, if nothing
else), graphics for slides, and a moving transcript of the lecture.
It is worth reflecting upon the momentous improvements in the infrastructure for
scientific work that have occurred in the last several decades because of
information technology. The ability
to simulate and graph results, to ask “what-ifs”, to share results
instantaneously, to search the world’s information archives, and to be
educated at one’s own pace remotely – what a pleasure it is to work in
science today, and how far we have come from the days of libraries and the
postal service.
No futurist predicted the emergence of the World Wide Web.
Yet in retrospect it seems to be an obvious idea.
This epitomizes the way information technology evolves.
While the underlying technology trends – the exponential increases in
processing power and bandwidth – are predictable, the applications are not.
While the technologists speak of bandwidth and dream of video, society
chooses email and the web. The ways
in which we use information processing and telecommunications appear to emerge
from the chaos of the social milieu.
Perhaps we are soon headed to a time when bandwidth and processing will be unlimited and
free. Latency will approach the
speed-of-light delay, which will be seen to pose a problem.
Service quality will approach the “five nines” (99.999%) availability
of the traditional telephone network. Encryption
will finally be freed of its political restraints, and security and privacy will
be assured.
But user access – the troublesome last mile – will continue to be the
bottleneck. Even in today’s
Internet2, the gigabit university network, so few users are empowered with the
promised gigabit speed that the potential of this experimental network remains
largely unexplored. Nevertheless,
megabit access speeds will become the norm in the next few years, and soon after
that, tens of megabits, when broadcast-quality video will move to the net.
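The "five nines" availability figure mentioned above translates into a surprisingly concrete downtime budget, which the arithmetic below makes explicit.

```python
# What "five nines" means in practice: the yearly downtime budget
# implied by a given availability figure.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(availability):
    return MINUTES_PER_YEAR * (1 - availability)

for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
    print(f"{nines} nines: {downtime_minutes(availability):.2f} min/year")
```

Five nines allows barely five minutes of outage per year, the standard the traditional telephone network set and the net is expected to match.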
Much of the access will be wireless and untethered, as will be our lifestyles.
Except for voice conversations, which will be seen as just another
application of the net, most communication will be asynchronous, happening at
the convenience of the users, who are increasingly turning into communications
nomads. But these technical
problems are relatively simple compared with the socio-technical engineering
required to improve the three dimensions of communications –
human-to-information, human-to-human, and human-to-computer.
For improving access to information, both greater digitization of archived
material and better search methodologies are necessary, but science has
always been about the murky chain of converting data to information, to
knowledge, and finally to wisdom, and this process is likely to remain
human-centered. In the second
dimension, human-to-human, the aim of communications has always been to
replicate the experience of a face-to-face relationship at a distance.
There is, however, no reason to believe that this perfect replication is
either necessary or desirable. Conceivably,
by mediating the human interaction, communications at a distance could be even
better than face-to-face.
Finally, the communication between humans and computers can be improved by speech
technology, natural language understanding, and machine implementation of common
sense reasoning. However, although
this year of 2001 is the year of HAL from the famous movie, we are nowhere near
HAL’s capabilities. It will
likely be some time before we start talking to computers like friends, and more
time still before we think of computers as fellow scientists.