Chapter 1
Introduction
How do people gain information about the world around them? Centuries ago, news was passed only by word of mouth, often in song. As time passed and the written word was no longer reserved for the wealthy or the privileged, news could be disseminated in print. As technology made the printed word more widely available, print became the primary vehicle for the dissemination of news. The next step came with the ability to communicate rapidly, over great distances, via wires.
As technology marched on, we eventually became able to transmit voice information reliably, and then to do so without wires. Radio made possible the one-way, widespread dissemination of information and paved the way for a new era in mass communication. The subsequent evolution to television again revolutionized the way people learned about the world around them.
Today there is a new revolution in news dissemination and the technology driving it is already well established in many industrialized nations. The revolution is the result of the World Wide Web and the technological advances in the transmission of information that accompany it. This digital age is changing how people are receiving and sending information on all levels. The changes are coming fast and furious for the industrialized nations and will eventually affect the whole world.
Argument 1
Digital technology will be the foundation for all news information.
The transmission of multimedia data is made possible by the digitization of visual and audio information; this is a key point. Digital conversion translates the information into a binary format, which allows the data to be processed by a computer like any other binary information. More importantly, the data can be distributed like any other binary file. No matter what the bits represent, they are still just a string of numbers: there is no fundamental difference between the coded information of text, sound, or video. The bits can be distributed together and then interpreted and processed by the receiver. The packets of information carry what are called "headers" that tell the receiver what kind of information they contain and how to process it. As a result, a video or multimedia presentation can be acquired anywhere a properly equipped computer or receiver can be connected to a telecommunications system.
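The idea can be sketched in a few lines of Python. The structure and field names below are purely illustrative and do not correspond to any actual network protocol; the point is only that text, sound, and video all reduce to bytes, and a small header tells the receiver how to interpret the payload.

# A toy sketch: binary payloads plus a descriptive header.
# The layout and field names are invented for illustration only.
text_payload = "Breaking news".encode("ascii")     # text as bytes
audio_payload = bytes([128, 131, 135, 132, 128])   # a few audio samples, also just bytes

packet = {
    "header": {"content_type": "text/plain", "length": len(text_payload)},
    "payload": text_payload,
}

# The receiver reads the header first, then decodes the payload accordingly.
# An audio or video payload would travel in the same wrapper with a different content type.
if packet["header"]["content_type"] == "text/plain":
    print(packet["payload"].decode("ascii"))       # prints: Breaking news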
Argument 2
The World Wide Web: The News Source for the World.
Whether it is widespread dissemination or interpersonal communication, the World Wide Web is forcing the world to change the way it communicates. As more people are connected to the Web, the popular media machine will have to adapt. This means that information providers will one day have to make the Web a fundamental part of their dissemination strategy. This new incarnation of information providers will materialize first in countries that have the infrastructure and welcome advances in technology.
“In a peculiarly American way we have often sought technical solutions to social problems. Indeed, more than anything else, this tendency defines American information culture…. And so we attach an enormous importance to new machines, especially information machines, layering them with all our hopes and dreams” (Lubar, 1993, p. 10).
American society openly accepts and often encourages technological advances in communications. This usually means that society makes incremental adjustments as some new device is unveiled that makes communication easier, better, clearer, or more fun. In some cases the device becomes an unexpected status symbol as with the cellular phone or pager.
The World Wide Web is much more than a new gadget; it is a method of transmission. This makes it more significant, but there are other recently introduced methods of transmission that some heralded as revolutionizing communications and that have not. Since Direct Broadcast Satellite (DBS) television was introduced in the United States, it has gained a respectable degree of acceptance, but it has not called the future of mass communications and the news media into question the way the Web has. DBS was just a new way to present the same thing. It can offer more channels, but it is still broadcasting, which means a one-way flow of information.
The World Wide Web is a two-way form of communication that, as its name states, is worldwide. It allows individuals to interact from anywhere on earth that has a connection. As a result, the communications industry will never be the same, and the changes that have come about are only the beginning.
Many will argue that this great revolution will not be possible because of limited bandwidth. In the short term this may be true, but in the not too distant future bandwidth will no longer be an issue of great concern. The advances being made in increasing bandwidth and compressing data will help systems worldwide overcome data transmission delays.
Argument 3
The powerful new tools of the Web will make sensory-rich multimedia presentation an essential part of the dissemination of information.
The advances in technology that push the limits of bandwidth are those in the presentation of multimedia on the Web. Every day more capabilities are added through the development of new cybercasting techniques and software. What was thought impossible a few years ago is a reality today. Even with limited bandwidth, the Web can already support the transmission of high-quality audio, as well as lower-quality video streams. This is only the beginning of a powerful drive to achieve richer distributed information on the Web. Development in this area is strong and well funded.
Argument 4
“Netronews” is going to be the new paradigm for the dissemination and presentation of news information.
There are fundamental changes coming that will alter the methods of information providers in every respect. As technology races forward, the limitations and capabilities of the Web change. News media will have to adjust to the current constraints of the medium and the new demands of the audience. Information providers will have to temper the richness of their presentation to accommodate the limitations of the medium.
The new role of the news provider will be not only to provide information but to facilitate the learning process of the user. The limitations traditional news media face in the space or time allotted to a story will change. A system of archiving stories and indexing them for the user will become a common part of the services offered in the Netronews era.
There is a need to illustrate the changes that are coming in the way people will receive their news as a result of the advance of digital technologies. The chief catalyst for the new news media in this digital world is the World Wide Web. Why should we care? This future news media experience is going to shape the way people see the world around them. The world's knowledge and understanding of current events will be shaped by it, for better or worse.
Chapter 2
Digital Technology: The Basis for Modern News
Sampling and Coding
In order to discuss how the Web has changed and will continue to change communications, it is necessary to understand what digital technology is. The basis for digital technologies is ones and zeros; all digital information is binary. How does this happen? At the most basic level, the information is coded this way. In computing, text is converted into binary form by assigning each character a value represented by a string of ones and zeros. A standard binary code for the representation of text was established, called ASCII, which stands for the American Standard Code for Information Interchange. The establishment of a standard protocol for text representation was essential for the creation of person-to-person, text-based communication on networked computer systems. The ability of computers to interchange data regardless of the type of computer system was fundamental to the rise of modern communications technology (Burger, 1993).
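As a simple illustration (a Python sketch, not drawn from any cited source), the word "News" reduces to ASCII codes and then to strings of ones and zeros:

# Print each character, its ASCII code, and its eight-bit binary form.
for ch in "News":
    print(ch, ord(ch), format(ord(ch), "08b"))
# N 78 01001110
# e 101 01100101
# w 119 01110111
# s 115 01110011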
The process is a little more difficult when dealing with sound or video. The visual and auditory information found in nature is analog, meaning it is made up of continuous signals with smooth fluctuations. Sound and light can be represented by waves: what we see is carried by light waves, and what we hear is created by sound waves. These waves can be digitized through a process of conversion from analog to digital.
Like ASCII text, audio and video information is coded for digitization. In this case the information is not assigned an arbitrary code; it is sampled. The sampling process involves recording the frequency and amplitude of the waves at a set interval.

[Figure 2.1; Figure 2.2]

Because sound and video are composed of waves, those waves are sampled at consistent intervals so that they can be reproduced. The rate of sampling needed to create a satisfying visual or audio playback is based on the limits of human perception and the limits of the hardware; the intended use or audience is also taken into account. The faster the sampling, the more storage the data needs and the more elaborate the equipment.
To faithfully reproduce a sound or moving image, the information must be sampled at a rate at least twice the highest frequency to be digitally represented. This is known as the Nyquist theorem. Anything sampled at less than twice its highest frequency may suffer perceptible quality loss. In some cases a slight loss of quality is acceptable when storage or bandwidth is limited (Burger, 1993).
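A short Python sketch makes the idea concrete. The 440 Hz tone and the 8,000-samples-per-second rate below are illustrative values only; the Nyquist theorem requires at least 2 x 440 = 880 samples per second to capture such a tone without loss.

import math

frequency = 440.0     # Hz, a pure tone used for illustration
sample_rate = 8000    # samples per second, comfortably above the Nyquist rate of 880

# Record the amplitude of the wave at evenly spaced instants in time.
samples = [math.sin(2 * math.pi * frequency * n / sample_rate)
           for n in range(8)]
print([round(s, 3) for s in samples])   # the first eight sample values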
CD audio is sampled 44,100 times a second (44.1 kHz). This creates the illusion of seamless sound waves. It is a fast sampling rate designed for the highest quality, with little regard for storage space. Such a rate is often not practical for computer multimedia applications, but satisfactory sound can be achieved at slower rates (Holsinger, 1994). New data streaming methods are eliminating these limiting factors and will be discussed in a later chapter.
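The storage cost of that rate is easy to work out. The back-of-the-envelope Python sketch below uses the standard CD figures of 16 bits per sample and two stereo channels.

sample_rate = 44100        # samples per second
bytes_per_sample = 2       # 16 bits per sample
channels = 2               # stereo

bytes_per_second = sample_rate * bytes_per_sample * channels
print(bytes_per_second)                    # 176,400 bytes every second
print(bytes_per_second * 60 / 1_000_000)   # roughly 10.6 megabytes per minute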
The Advantages of Digital Communication
Once digitized, the information is binary. Since the information is based on numbers, it can be manipulated like any set of numbers. These numbers are what we know as bits, and bits are what computers process, receive, and transmit. These simple bits have already changed communication as we know it.
One of the characteristics that makes a bit so valuable is its ability to travel. Anyone who has used Email has sent ASCII-coded bits to another location. Bits can travel through a variety of media, and they can travel quickly, at speeds approaching the speed of light (depending on the medium). Compared with analog signals this looks very good, and it is only one small part of the list of advantages of digital data.
Analog signal transmission, as in traditional television, radio, or telephone, is subject to signal attenuation. It is the nature of the waves being transmitted to be susceptible to interference. When data is transmitted digitally some of it can still be lost, but the transfer is much more reliable. In addition, information can be added to the signal to correct errors that occur during transmission. "On your audio CD, one-third of the bits are used for error correction. Similar techniques can be applied to existing television so that each home receives studio-quality broadcast—so much clearer than what you get today that you might mistake it for so-called high definition" (Negroponte, 1995, p. 17). This sort of error correction exists on the Web to help ensure that information gets to its destination without loss.
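The principle can be sketched with a toy example in Python. The one-byte checksum below is far simpler than the Reed-Solomon codes used on audio CDs or the checksums built into Internet protocols, but the idea is the same: extra bits travel with the data so the receiver can tell whether it arrived intact.

def checksum(data: bytes) -> int:
    """A deliberately simple one-byte checksum, for illustration only."""
    return sum(data) % 256

message = b"headline"
sent = (message, checksum(message))      # the data plus its check value

received_data, received_sum = sent       # pretend this pair crossed a network
if checksum(received_data) == received_sum:
    print("data arrived intact")
else:
    print("error detected; request retransmission")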
The transmission of digital data, particularly once compressed, does not require as much bandwidth as an equivalent analog signal. This means the information can be sent down a smaller pipe. An example is the difference between an analog and a digital television signal: the spectrum allocation for one channel of analog television can carry several digital channels that offer a better picture. On the Web, the rules of bandwidth are the same. The amount of time necessary to receive a file depends on the size of the file and the available bandwidth. The ability to compress the data helps make digital information very valuable.
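The relationship is simple arithmetic, as the Python sketch below shows; the one-megabyte file and the two connection speeds are illustrative figures only.

file_size_bits = 1_000_000 * 8     # a one-megabyte file expressed in bits

for name, bits_per_second in [("28.8 kbps modem", 28_800),
                              ("1.5 Mbps line", 1_500_000)]:
    seconds = file_size_bits / bits_per_second
    print(f"{name}: about {seconds:.0f} seconds")   # roughly 278 s versus 5 s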
Consider this information in a bigger sense. Anything that can be represented by bits can now be distributed reliably around the world at amazing speed, and the dissemination or transport of this information is inexpensive. If the information travels via the Web, it never has to leave the atmosphere to find a satellite. This makes digital transfer very attractive for all sorts of applications, especially to those in the information business. The transmission process is limited only by the speed of the processor, the capacity of the transfer media, and the available bandwidth. If the amount of data can be reduced, then faster transmission can be achieved; hence the creation of data compression, which will be discussed later. First, let's discuss how this grand network we call the World Wide Web got started.
Chapter 3
The World Wide Web: The News Source for the World.
The History of the Internet and the World Wide Web
All of this discussion of digital information is important because all of the information on the Web is digital. The World Wide Web rides on the network through which computers around the world send and receive these digital signals. It all started in the United States as a military project. In 1968 the Department of Defense needed a communications network that would be able to function even if portions of it were suddenly eliminated. It was the height of the Cold War, and the U.S. military needed a communication method that could withstand a nuclear attack. The solution was ARPANET (Advanced Research Projects Agency Network) (Kroll, 1992). This network moved information in packets that could reliably find their way to a destination; the key was that the protocol used by the network could find a path that would get the information there reliably.
The first nodes were set up in 1969. Much of the work on the network was being done by computer scientists and engineers at universities, so the nodes were set up at those universities (Prater, 1994, p. 163). The decentralized architecture of the network made it relatively easy to add more computers and expand. The network grew quickly and found its greatest use among researchers and universities. By 1983 it had grown enough to be split into two separate networks, one for research and education and the other for the military.
The fundamental factor in the explosive growth of the ARPANET was the creation of a standard protocol for the transmission of data. TCP/IP (Transmission Control Protocol/Internet Protocol) set the standard and made it possible for different computers to communicate across vast networks. The value of this communication medium led to the creation of the NSFNET: the National Science Foundation invested money to connect universities and research centers, making TCP/IP the standard for the Internet (Prater, 1994, p. 150).
“The technology and networks were adopted by other government agencies and countries, as well as the private business sector. Today, Internet technology and the Internet have found massive acceptance and use by hundreds of thousands of organizations around the world…. As of 1 Feb 1995, the Internet consisted of more than 50,000 networks in 90 countries. Gateways that allow at least Email connectivity extend this reach to 160 countries. At the end of 1994, 5 million computers were indicated as actually reachable - with an estimated total of 20-40 million users. Network growth continues at around 10 percent per month.”[1]
In the late 1980s, the joining of the Internet with other similar networks around the world made it even more valuable, enabling researchers around the world to share their findings. The problem was that it was still reserved for the computer literate: the interface was not user friendly, and it was usually just text, requiring knowledge of assorted computer commands. For the Internet to realize its full potential, a new system would have to be developed. It came through the European Particle Physics Laboratory (CERN). In March of 1989 a scientist named Tim Berners-Lee proposed to CERN a project that would allow researchers to read each other's work over the Internet. Berners-Lee proposed a new language that would include hypertext. This Hypertext Markup Language (HTML) is the language web pages are written in today, and the Hypertext Transfer Protocol (HTTP) was created as the standard for delivering these new documents (Magdid et al., 1995, p. 9).
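Both standards are, at bottom, plain text. The Python sketch below (with a made-up page and a placeholder hostname, for illustration only) shows a tiny HTML document containing a hypertext link, followed by the kind of text-based HTTP request a browser sends to ask a server for such a document.

# A minimal HTML document: ordinary text plus markup, including one
# hypertext link. The content is invented for illustration.
page = """<html>
<head><title>Sample Page</title></head>
<body>
  <p>Read the <a href="story.html">full story</a>.</p>
</body>
</html>"""

# The kind of request a browser sends over HTTP to fetch a document.
# The hostname is a placeholder.
request = "GET /index.html HTTP/1.0\r\nHost: www.example.com\r\n\r\n"

print(request)
print(page)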