Published in the Proceedings of the IEEE
The Proceedings received Green’s visionary paper on future transmission systems a few months before I started my own career at Bell Laboratories. I remember reading this paper when it was first published, and coincidentally had several occasions to revisit it in subsequent years when I tried to draw wisdom from past attempts at projecting the future of telecommunications. It is now an unexpected pleasure to be able to write a retrospective view of this beautiful, if misguided, manuscript that carries such nostalgic echoes of the “golden years” at Bell Labs.
Estill Green was the first vice president of systems engineering at Bell Labs.
He later became an executive vice president of the Labs, and had retired
shortly before this paper was written in 1961.
He was rather a legendary executive at the Labs, as many were in those
hallowed years. I even discovered
in the course of writing this introduction that he was the author of the
infamous "Green book" that all old Bell Labs employees will remember
as the repository of comparative salary data.
(I had always thought that it was called the Green book because the cover
was green!) I regret to say that I
never had the privilege of meeting him.
Green’s predictive paper has two main points: he projects the demands for long haul
transmission in the year 2012, and he discusses several technologies that might
meet these capacity demands. Needless
to say, he was wrong on both points. However,
he was not terribly wrong, and as I read the paper over and over, I almost feel
a subliminal knowledge of the true future seeping through the interstices of his
writing.
There is always a tendency in retrospective evaluations of predictive papers to poke
fun at them and point out how ridiculous they are. I’ve often done it myself, and even here I have to fight
the tendency. The thought that
brings me up short is imagining myself writing such a paper now, projecting
transmission in the year 2050 or thereabouts.
Would I do as well as Green has done?
I am sure that I could not. I
have the feeling that somehow the world today is less predictable than it was in
1961, and that unimaginable progress will take place in the next 40 years.
At least Green came close to imagining what might happen in his own
40-year interval.
This paper is elegant and old-fashioned in its use of language.
The writing conveys the quaint flavor of a renaissance
scientist-executive. I had to look
up the word “mantic” (it means prophetic) in just the second sentence.
Green gives us the image of computers at that early day
“metamorphosing” business operations, and then adds an apt quote from
Tennyson. Who writes in this style
today? Unfortunately, I cannot
picture a top executive at an equivalent company today writing such an informed
and philosophical paper.
I feel guilty of violating Green’s observation in the first paragraph, when he
says that a half-century prediction interval “confers a fair degree of
immunity to after-the-fact checkup by present readers.”
Alas, I was a present reader – then – and here I am giving him an
after-the-fact checkup almost 40 years later.
Shame on me.
Green argues that the demand for communication in the future will be the product of
three factors – population growth, telephone density growth, and increased use
of bandwidth because of ubiquitous adoption of video telephony.
Combining these factors, Green makes the bold prediction that a bandwidth
of as much as 100,000 megacycles (now MHz, of course) might be required on major
inter-city routes in the year 2012. He
also projects that 10 GHz would be required on the trans-Atlantic cable.
I feel sure that engineers of that day would have been awed by his prediction of
100,000 megacycle capacities. This
was a bold prediction indeed, and I believe that it stands well in comparison to
many of the other predictions contained in this 1962 issue of the Proceedings.
So many of the predictions in this issue were incredibly timid or
ill-conceived. They all missed the exponential expansion of technology, but
Green comes closest to a vision here. When
he says that "Technological change bids fair not just to continue, but to
accelerate" I hear the rumblings of the faraway train -- Moore's Law is on
its way.
Green’s analysis sounds very logical. In
those days, and for decades to come, projecting traffic demand was rather an
exact science. Telephone companies
studied population growth, both in size and in location. Their carrier routes were carefully sized, so
there would be enough capacity to meet the projected demand, but not an excess.
Transmission was expensive, and was not to be wasted.
Such a methodology served them well then.
Later it betrayed them, when they discovered that their routes were
under-provisioned and new competitors moved in to take advantage of their
parsimony.
In fact, the population probably did approximately double as predicted, and the
telephone density in the United States did, indeed, approach saturation.
However, the extreme optimism in Green’s projection was based on the
widespread adoption of videotelephony, which would have multiplied the demand by
about a factor of one thousand as compared with voice.
Nonetheless, as we shall see, his projection was still too conservative,
even without the impetus that videotelephony would have added.
The factors that Green missed in projecting demand were as much sociological as
technological. He could not have
foreseen the rise of the Internet or the fax machine (which had already been
invented). No one was yet thinking
of cellular telephony or pagers. But
even in traditional voice telephony, the world of 1961 was one in which long distance
was expensive and used only reluctantly. “Long
distance calling” was then a phrase that electrified a recipient’s
attention.
Today the telephone occupies a different place in the social milieu than it did 40
years ago. Travelers carry small
phones in their pockets, and think nothing of talking at any time to any place
in the world. Little inventions
like touchtone dialing, credit card calls, and 800-number calling changed the
perception and use of telephones. The
globalization of business brought a new need for worldwide connectivity, and
deregulation and privatization brought competition for the world’s service
providers. Between competition and
technological innovation, the cost of long distance telephony was reduced by an
order of magnitude. Simply put, we
use the telephone a lot more than Green could ever have imagined!
Videotelephony
The population growth and teledensity were the easy part of Green’s prediction.
The adoption of videotelephony that he had foreseen, however, did not
happen. It is an important part of
this historical view, because the idea of Picturephone dominated the planning
and technology of that era. Almost
simultaneously with the publication of Green's paper, the Bell System was
demonstrating the Picturephone at the New York World's Fair.
A few years later Arthur C. Clarke featured it in the movie 2001: A Space Odyssey.
As much as HAL, the talking, thinking computer, the Picturephone was the
very symbol of the future that we all knew would happen.
Green was only one of the many, many futurists through the years who fell prey to the
continuing plague of videotelephony predictions. Even today, 40 years later, the predictive papers are still
saying that videotelephony will be the future.
Perhaps so, but it has not happened yet. I’ve often thought myself that such a prediction is just
too easy. Whenever anyone is asked
to predict the future of communications, the first thought is that after speech,
the next thing must be video. What
else is there? For lack of any
other futuristic scenario, every seer projects the widespread use of video.
In making such a prediction Green had a tremendous advantage over a futurist of
today – he was uniquely enabled to create, even mandate, the future,
not just predict it. No one today
has this capability. In 1961 the
Bell System had a monopoly on telephony. Bell
Labs was supported by a license contract fee from the operating Bell Telephone
Companies, a fee that was approved by regulators and charged to the consumer as
a portion of their ordinary telephone service.
Thus if Bell Labs wanted to develop the Picturephone, they could, and the
cost could be passed on, if approved by the regulatory authorities.
The future of telephony was theirs to plan and implement.
When Green wrote this paper, it is likely that plans for the development of
Picturephone were underway. The
actual full-scale development took place in the late 1960s. It is even conceivable that this paper itself was
instrumental in the decision to begin development of the ill-fated Picturephone.
Corporate histories are never written in such a way that the true origins
of such projects are evident. Perhaps
even the principals themselves are unaware or forgetful of the hallway
conversations and unremarkable events that one day blossom into a full-scale
project. This was the era, however,
when projects could be decreed, irrespective of market demand and cash flow
analyses. The Bell System tried to
do what it thought was right -- and that was not all bad by any means.
In 1972 the Bell System did indeed unleash the newly developed Picturephone into an
expectant market. Everyone with a
Picturephone would require a 6 MHz analog channel. Coincidentally, just about this same time the first digital
carrier, T-1, was being deployed for intra-city transmission within large
metropolitan areas. However, the
long haul world even at this comparatively late date was almost exclusively
analog. It would seem that after
the first decade, Green’s predictions were right on course.
This is when things started to go seriously wrong for his predictions.
Picturephone failed, and videotelephony was set back for at least a half
century. Little noticed by the
engineers at Bell Labs, scientists at Corning were about to unveil the first
practical optical fiber that would promise unlimited bandwidth and change the
world of transmission irrevocably. At
the same time, and again comparatively little noticed at Bell Labs or elsewhere
in the telecommunications industry, the ARPANET was beginning to take shape – the
unforeseen monster that would eat all of Green’s predicted bandwidth demand,
and more. It will be necessary to talk more about each of these events,
but it is curious that they all occurred at almost the same time.
Assessing Green's Capacity Projection
How well did Green do in projecting a traffic demand of 100,000 MHz on major
inter-city carriers? Well, to begin
we note that a great clue to the change in character of today's network is that
we no longer talk in terms of bandwidth, but instead we speak of bit rates.
Conservatively, Green’s 100,000 megacycles becomes 100 Gbps, or the
equivalent of 10 OC-192 channels on an optical carrier.
Since wavelength division multiplexing will soon allow as many as 100
channels to be placed on a single fiber, by one measure Green would have
predicted a traffic requirement of only one-tenth of the probable ultimate
capacity of a single fiber.
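
For concreteness, here is that arithmetic as a small Python sketch. The figures come from the paragraph above; the conversion of 1 Hz of analog bandwidth to 1 bit/s is only a rough rule of thumb, not anything Green specified.

```python
# Back-of-the-envelope check of the conversion above, assuming ~1 b/s per Hz.

green_bandwidth_mhz = 100_000                 # Green's 100,000 megacycles
bit_rate_gbps = green_bandwidth_mhz / 1_000   # ~1 b/s per Hz -> 100 Gbps

oc192_gbps = 10                               # OC-192 is roughly 10 Gbps
oc192_channels = bit_rate_gbps / oc192_gbps   # 10 channels

wdm_channels = 100                            # projected WDM channels per fiber
fiber_capacity_gbps = wdm_channels * oc192_gbps   # ~1 Tbps per fiber

print(f"Demand: {bit_rate_gbps:.0f} Gbps = {oc192_channels:.0f} OC-192 channels")
print(f"One WDM fiber: {fiber_capacity_gbps:.0f} Gbps; "
      f"the demand is {bit_rate_gbps / fiber_capacity_gbps:.0%} of one fiber")
```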
Unfortunately, the actual capacity today can no longer be quantified as easily as it was in
1961. Then the Bell System was the
only long haul carrier, and its routes were as etched in stone as the interstate
highway system. It was easy to talk
about the capacities of such routes as the northeast corridor between Boston,
New York, and Washington. Today
there are many competing companies, and their routes criss-cross each other in
evidence of varying priorities and strategies.
Moreover, the capacities are constantly changing as new wavelengths are
added and unlit fibers are put into service.
There are large new companies, notably Level 3 and Qwest, who view their
job as “digging up America.” An
interesting statistic that was true in 1992, and is probably surpassed today, is
that optical fiber was then being laid at a speed, in fiber miles per year, that
exceeded the speed of sound!
With those caveats, we might approach the estimation of major route capacities in
various ways. For terrestrial
circuits perhaps the best benchmark is Qwest, which claims to be installing 48
strands of fiber connecting 130 cities with OC-192 (about 10 Gbps) capacities.
Obviously, wavelength multiplexing would be added as traffic demand
increased, but the aggregate potential capacity would be in the terabit regime.
I have the strong belief that Estill Green would be shocked at this
knowledge – not because the capacity exceeded his estimate by an order of
magnitude, but because the fiber belonged to Qwest, rather than
AT&T. That would have been
unthinkable in 1961!
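
A rough tally of the Qwest figures above, as a Python sketch. The 100-wavelength DWDM channel count is my assumption for illustration, not an announced figure.

```python
# Rough aggregate-capacity tally for the Qwest example above.

strands = 48                    # fiber strands per route
oc192_gbps = 10                 # about 10 Gbps per wavelength
wdm_channels = 100              # assumed eventual DWDM channels per strand

base_gbps = strands * oc192_gbps                              # 480 Gbps
potential_tbps = strands * wdm_channels * oc192_gbps / 1_000  # 48 Tbps

print(f"One wavelength per strand: {base_gbps} Gbps")
print(f"With {wdm_channels}-channel WDM: {potential_tbps:.0f} Tbps")
```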
For the undersea cables the disparity is more evident. Green predicted 10 GHz of bandwidth on the trans-Atlantic
route. Today there is in excess of
100 Gbps from the US to Europe, and there is much more coming very soon.
The announced TAT-14 cable, representing a consortium of providers, has a
capacity of 640 Gbps. This cable is due for operational capability at the end of
2000. As an indication of the type
of traffic growth, TAT-14 is projecting that 80% of this huge capacity will be
for Internet usage. Another large
venture, Project Oxygen, vows to connect 78 countries with a capacity of 1,280
Gbps on each segment. In addition
there is a large and growing satellite capacity for these same routes.
Thus the technology and installed media will have the capability of exceeding
Green’s projections by several orders of magnitude – and this is 12 years
before Green’s anchor date of 2012. What
will happen by then seems unpredictable today.
There has been such an explosion of data traffic that the growth rate is
said to be 300% annually. (The
actual growth rate is probably impossible to measure, even if there would be
agreement on what constitutes “data”.)
The consequence of such a growth rate, if maintained, would be that in 12
years today’s traffic would be multiplied by almost six orders of magnitude.
This seems impossible, and some saturation must surely occur, but in any
event it seems clear that in the year 2012 today's network will no longer exist,
and will have been swallowed in a sea of vastly larger capacity.
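
For the curious, the compounding works out as follows (a Python sketch; I read "300% annually" as a tripling each year, which is the reading that reproduces the "almost six orders of magnitude" figure, and the growth rate itself, as noted, is only hearsay).

```python
import math

# "300% annually" read as a tripling each year.
annual_factor = 3
years = 12                      # from 2000 to Green's anchor date of 2012

multiplier = annual_factor ** years    # 3**12 = 531,441
orders = math.log10(multiplier)        # about 5.7

print(f"Growth of x{multiplier:,}, about {orders:.1f} orders of magnitude")
```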
If instead of looking at the potential capacity of installed media, we estimate the
actual traffic today, the numbers are smaller, and ironically consistent with
Green’s prediction. In the
Internet there are a few hundred backbone routes with 45 Mbps capacity, a smaller
number with 155 Mbps, and a handful at 622 Mbps.
Perhaps on a particular inter-city route there might be a total of
0.5 to 1.0 Gbps of capacity. At an
example major exchange point – MAE-East – there was about 2.0 Gbps of input
traffic at the beginning of 1999.
In international traffic, the largest carrier, AT&T, is said to have
10 billion outgoing minutes of traffic per year.
Allowing for the traffic burstiness, this might require 6 Gbps of capacity
-- almost exactly Green's figure for trans-Atlantic capacity.
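
A sketch of how that estimate can be reconstructed in Python. Only the 10 billion minutes and the 64 kbps voice rate come from the text; the factor-of-five busy-hour concentration is my own assumption for the burstiness, chosen to reproduce the 6 Gbps figure.

```python
# Reconstructing the 6 Gbps estimate for AT&T's international traffic.

minutes_per_year = 10e9
seconds_per_year = 365 * 24 * 3600              # about 31.5 million

avg_calls = minutes_per_year * 60 / seconds_per_year   # ~19,000 simultaneous
voice_kbps = 64                                 # one voice circuit

avg_gbps = avg_calls * voice_kbps / 1e6         # ~1.2 Gbps on average
peak_gbps = avg_gbps * 5                        # ~6 Gbps at the busy hour

print(f"Average {avg_gbps:.1f} Gbps; busy hour roughly {peak_gbps:.0f} Gbps")
```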
Another way to look at this would be to estimate the edge traffic at the periphery of
the national network. Today there
are about 30 million registered computer hosts on the Internet in the United
States. With an activity factor of
0.5, and an average bit rate of 50 kbps, this would produce a total traffic
of 750 Gbps. If a major route were
to carry as much as 10% of the total traffic, this would be 75 Gbps, right in
line with Green’s prediction. A
similar figure can be obtained for voice traffic in the US – 150 million
telephones, 10% usage factor, and 64 kbps full-duplex transmission, 10% on a
major trunk – which gives 192 Gbps.
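
Both back-of-the-envelope estimates are easy to verify; here they are as a Python sketch. Note that reading "64 kbps full-duplex" as 64 kbps in each direction (128 kbps total) is my assumption, and it is what reproduces the 192 Gbps figure.

```python
# The two edge-traffic estimates above, with the figures as given.

# Internet side
hosts = 30e6                    # registered US hosts
activity = 0.5                  # fraction active at any moment
host_kbps = 50                  # average bit rate per active host
internet_gbps = hosts * activity * host_kbps / 1e6      # 750 Gbps
major_route_gbps = 0.10 * internet_gbps                 # 75 Gbps

# Voice side: 64 kbps each way, counted in both directions
phones = 150e6
usage = 0.10                    # fraction in use at any moment
duplex_kbps = 2 * 64            # 128 kbps per call
voice_gbps = phones * usage * duplex_kbps / 1e6 * 0.10  # 192 Gbps

print(f"Internet: {internet_gbps:.0f} Gbps total, "
      f"{major_route_gbps:.0f} Gbps on a major route")
print(f"Voice: {voice_gbps:.0f} Gbps on a major trunk")
```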
So Green's traffic projections look reasonably accurate (in astronomical units of
orders of magnitude) for the actual traffic, in spite of the fact that his
reasoning was completely wrong. The
three orders of magnitude increase for videotelephony, which did not happen, was
made up by the increase in data traffic, the increased use of the telephone, and
the increased global connectivity.
On the other hand, Green's figures were intended to represent not the actual
traffic, but the deployment of capacity, and these projections are off by about
one order of magnitude for terrestrial traffic and by two orders of magnitude
for international traffic. Furthermore,
the growth rates between now and 2012 will add another several orders of
magnitude of traffic. While the
growth rate of voice traffic, 3-5%, would not have surprised Green, the
incredible growth rate of data traffic – in excess of 100% -- would have been
a complete surprise to the engineer or executive of 1961.
By the year 2012, Green will be seen as far wrong by any astronomical
measuring stick.
Transmission Technology
In the world of Bell Labs in 1961, transmission was king, and the big transmission
project was the millimeter waveguide. Prototype
systems were being tested at that time with a buried pipe of about 2 inches in
diameter that carried information modulated on a millimeter wave beam in a
circular polarization mode with an electric field of zero at the pipe edge.
In theory there was no loss with distance, but that was true only as long
as the mode was perfect. Imperfections
and bends in the pipe created other modes with edge loss.
It was partly the object of the research to determine how low these losses
could be kept under field conditions. I
had always heard that the Crawford Hill Laboratory of Bell Labs had been built
in a long straight line just to enable experimentation with the millimeter wave
system.
The millimeter waveguide had a bandwidth of about 10 GHz. It would seem that 10 such guides would have been required on
a route with his forecasted 100 GHz traffic. We could imagine a world in which this development had gone
forward, and it seems that today's traffic demands could indeed have been met
with the waveguide system proposed in 1961.
However, as we look now at the near future, the growth of Internet and
data traffic would have been choked by this technology.
Fortunately, we were spared that bottleneck by the emergence of optical
fiber.
It surprises me that Green gave equal press in his paper to the possibility of
optical transmission. The optical
maser, as it was called then, had just been invented. Green suggests that the maser might be used to transmit light
pulses through a gas-filled pipe similar to the one planned for the millimeter waveguide
("to shield the beam from outside effects, such as rain, fog, refraction,
and inhomogeneities"). Personally,
I remember talk of optical transmission at Bell Labs in those years, and the
reverent analogies to Alexander Graham Bell's photophone.
The idea of optical transmission had historical appeal, as well as a
powerful scientific allure.
About a decade later scientists at Corning Glass developed the first optical fiber
with an attenuation meeting the magic goal of only 20 dB per kilometer.
Suddenly all the steam went out of the waveguide project.
Engineers and scientists with optical knowledge, formerly on the fringes
of activity, were now in great demand. Within
months all forward-looking work on transmission at Bell Labs was converted to
optical fiber activity. Within
another decade optical carriers were deployed for commercial transmission, and
shortly thereafter AT&T wrote off its entire analog transmission plant.
Green was right in surmising that a guided conduit was needed for optics;
he just hadn't foreseen that that conduit would be a hair-thin, flexible strand
of glass. That small strand of
glass changed the world.
Looking Forward Now
Today no one treads the ground that Estill Green covered in 1961.
I see no predictions of the telecommunications (not
"telephone") traffic in the year 2050, or of the technology that will
be used for transmission. No one
company has the charter to plan the evolution of communications, and the
dizzying expansion of technology, driven by Moore's Law exponentials, is seen in
some sense as chaotic.
Telecommunications traffic today more than doubles annually. It
is inconceivable that this rate could be maintained or surpassed for the next 50
years. Something has to happen;
something has to break. Yet the
penetration of video is still very low, and high definition and three dimensions
lie beyond. Large populations on
earth have very low teledensities, and continents await Internet connection.
Wireless chips offer ubiquitous inexpensive Internet connection.
Pundits say that everything will be connected to everything else.
Perhaps there is a basic human law that says that there is an insatiable
appetite for bandwidth.
What people will do with the future bandwidth, however, is a question that we have
learned not to answer. The
telecommunications industry has always gotten these visions wrong. In the 1970s it was to be Picturephone; in the 1980s it was
to be Home Information Systems, and in the early 1990s it was to be
video-on-demand. The Internet
sneaked up on the industry, and no one foresaw the web.
The only thing that seems dependable is the "Field of Dreams"
proposition: If we build it, they will come.
But how will it be built? The answer
today is fiber with wavelength division multiplexing. The "ultimate" capacity of a fiber is believed to
be about a terabit per second, but we have often seen in the past that such
limits are dependent upon assumptions that are later faulted.
But there must be some limit, and when that limit is reached, the answer
to more capacity is more fibers. Is
there anything else? What comes
after fiber? I couldn't venture a
guess. Green was way ahead of me.
He did a beautiful job, even after being second-guessed 40 years later.
Robert W. Lucky
Telcordia Technologies
Rlucky@telcordia.com